00:00:00.001 Started by upstream project "autotest-per-patch" build number 122846 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.011 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.012 The recommended git tool is: git 00:00:00.012 using credential 00000000-0000-0000-0000-000000000002 00:00:00.014 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.029 Fetching changes from the remote Git repository 00:00:00.030 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.047 Using shallow fetch with depth 1 00:00:00.047 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.047 > git --version # timeout=10 00:00:00.070 > git --version # 'git version 2.39.2' 00:00:00.070 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.071 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.071 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.276 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.287 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.300 Checking out Revision 10da8f6d99838e411e4e94523ded0bfebf3e7100 (FETCH_HEAD) 00:00:02.300 > git config core.sparsecheckout # timeout=10 00:00:02.311 > git read-tree -mu HEAD # timeout=10 00:00:02.331 > git checkout -f 10da8f6d99838e411e4e94523ded0bfebf3e7100 # timeout=5 00:00:02.348 Commit message: "scripts/create_git_mirror: Update path to xnvme submodule" 00:00:02.349 > git rev-list --no-walk 10da8f6d99838e411e4e94523ded0bfebf3e7100 # timeout=10 00:00:02.523 [Pipeline] Start of Pipeline 00:00:02.545 [Pipeline] library 00:00:02.547 Loading library shm_lib@master 00:00:02.547 Library shm_lib@master is cached. Copying from home. 00:00:02.572 [Pipeline] node 00:00:02.589 Running on GP12 in /var/jenkins/workspace/crypto-phy-autotest 00:00:02.595 [Pipeline] { 00:00:02.628 [Pipeline] catchError 00:00:02.630 [Pipeline] { 00:00:02.643 [Pipeline] wrap 00:00:02.652 [Pipeline] { 00:00:02.658 [Pipeline] stage 00:00:02.659 [Pipeline] { (Prologue) 00:00:02.890 [Pipeline] sh 00:00:03.164 + logger -p user.info -t JENKINS-CI 00:00:03.187 [Pipeline] echo 00:00:03.190 Node: GP12 00:00:03.198 [Pipeline] sh 00:00:03.490 [Pipeline] setCustomBuildProperty 00:00:03.501 [Pipeline] echo 00:00:03.502 Cleanup processes 00:00:03.507 [Pipeline] sh 00:00:03.785 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.785 3692104 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:03.798 [Pipeline] sh 00:00:04.071 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:04.071 ++ grep -v 'sudo pgrep' 00:00:04.071 ++ awk '{print $1}' 00:00:04.071 + sudo kill -9 00:00:04.071 + true 00:00:04.084 [Pipeline] cleanWs 00:00:04.092 [WS-CLEANUP] Deleting project workspace... 00:00:04.092 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.098 [WS-CLEANUP] done 00:00:04.102 [Pipeline] setCustomBuildProperty 00:00:04.116 [Pipeline] sh 00:00:04.389 + sudo git config --global --replace-all safe.directory '*' 00:00:04.441 [Pipeline] nodesByLabel 00:00:04.442 Found a total of 1 nodes with the 'sorcerer' label 00:00:04.452 [Pipeline] httpRequest 00:00:04.456 HttpMethod: GET 00:00:04.457 URL: http://10.211.164.101/packages/jbp_10da8f6d99838e411e4e94523ded0bfebf3e7100.tar.gz 00:00:04.459 Sending request to url: http://10.211.164.101/packages/jbp_10da8f6d99838e411e4e94523ded0bfebf3e7100.tar.gz 00:00:04.462 Response Code: HTTP/1.1 200 OK 00:00:04.463 Success: Status code 200 is in the accepted range: 200,404 00:00:04.463 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_10da8f6d99838e411e4e94523ded0bfebf3e7100.tar.gz 00:00:04.601 [Pipeline] sh 00:00:04.880 + tar --no-same-owner -xf jbp_10da8f6d99838e411e4e94523ded0bfebf3e7100.tar.gz 00:00:04.898 [Pipeline] httpRequest 00:00:04.902 HttpMethod: GET 00:00:04.903 URL: http://10.211.164.101/packages/spdk_2dc74a001856d1e04b15939137e0bb63d27e8571.tar.gz 00:00:04.904 Sending request to url: http://10.211.164.101/packages/spdk_2dc74a001856d1e04b15939137e0bb63d27e8571.tar.gz 00:00:04.906 Response Code: HTTP/1.1 200 OK 00:00:04.907 Success: Status code 200 is in the accepted range: 200,404 00:00:04.908 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_2dc74a001856d1e04b15939137e0bb63d27e8571.tar.gz 00:00:22.043 [Pipeline] sh 00:00:22.323 + tar --no-same-owner -xf spdk_2dc74a001856d1e04b15939137e0bb63d27e8571.tar.gz 00:00:24.860 [Pipeline] sh 00:00:25.133 + git -C spdk log --oneline -n5 00:00:25.133 2dc74a001 raid: free base bdev earlier during removal 00:00:25.133 6518a98df raid: remove base_bdev_lock 00:00:25.133 96aff3c95 raid: fix some issues in raid_bdev_write_config_json() 00:00:25.133 f9cccaa84 raid: examine other bdevs when starting from superblock 00:00:25.133 688de1b9f raid: factor out a function to get a raid bdev by uuid 00:00:25.145 [Pipeline] } 00:00:25.161 [Pipeline] // stage 00:00:25.169 [Pipeline] stage 00:00:25.170 [Pipeline] { (Prepare) 00:00:25.185 [Pipeline] writeFile 00:00:25.197 [Pipeline] sh 00:00:25.469 + logger -p user.info -t JENKINS-CI 00:00:25.480 [Pipeline] sh 00:00:25.757 + logger -p user.info -t JENKINS-CI 00:00:25.768 [Pipeline] sh 00:00:26.045 + cat autorun-spdk.conf 00:00:26.045 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:26.045 SPDK_TEST_BLOCKDEV=1 00:00:26.045 SPDK_TEST_ISAL=1 00:00:26.045 SPDK_TEST_CRYPTO=1 00:00:26.045 SPDK_TEST_REDUCE=1 00:00:26.045 SPDK_TEST_VBDEV_COMPRESS=1 00:00:26.045 SPDK_RUN_UBSAN=1 00:00:26.050 RUN_NIGHTLY=0 00:00:26.056 [Pipeline] readFile 00:00:26.078 [Pipeline] withEnv 00:00:26.080 [Pipeline] { 00:00:26.094 [Pipeline] sh 00:00:26.369 + set -ex 00:00:26.369 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:00:26.369 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:26.369 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:26.369 ++ SPDK_TEST_BLOCKDEV=1 00:00:26.369 ++ SPDK_TEST_ISAL=1 00:00:26.369 ++ SPDK_TEST_CRYPTO=1 00:00:26.369 ++ SPDK_TEST_REDUCE=1 00:00:26.369 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:26.369 ++ SPDK_RUN_UBSAN=1 00:00:26.369 ++ RUN_NIGHTLY=0 00:00:26.369 + case $SPDK_TEST_NVMF_NICS in 00:00:26.369 + DRIVERS= 00:00:26.369 + [[ -n '' ]] 00:00:26.369 + exit 0 00:00:26.379 [Pipeline] } 00:00:26.400 [Pipeline] // withEnv 00:00:26.406 [Pipeline] } 00:00:26.422 [Pipeline] // stage 00:00:26.429 [Pipeline] catchError 00:00:26.431 [Pipeline] { 
00:00:26.444 [Pipeline] timeout 00:00:26.444 Timeout set to expire in 40 min 00:00:26.446 [Pipeline] { 00:00:26.463 [Pipeline] stage 00:00:26.465 [Pipeline] { (Tests) 00:00:26.482 [Pipeline] sh 00:00:26.760 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:00:26.760 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:00:26.760 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:00:26.760 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:00:26.760 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:26.760 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:00:26.760 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:00:26.760 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:26.760 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:00:26.760 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:00:26.760 + cd /var/jenkins/workspace/crypto-phy-autotest 00:00:26.760 + source /etc/os-release 00:00:26.760 ++ NAME='Fedora Linux' 00:00:26.760 ++ VERSION='38 (Cloud Edition)' 00:00:26.760 ++ ID=fedora 00:00:26.760 ++ VERSION_ID=38 00:00:26.760 ++ VERSION_CODENAME= 00:00:26.760 ++ PLATFORM_ID=platform:f38 00:00:26.760 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:00:26.760 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:26.760 ++ LOGO=fedora-logo-icon 00:00:26.760 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:00:26.760 ++ HOME_URL=https://fedoraproject.org/ 00:00:26.760 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:00:26.760 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:26.760 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:26.760 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:26.760 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:00:26.760 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:26.760 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:00:26.760 ++ SUPPORT_END=2024-05-14 00:00:26.760 ++ VARIANT='Cloud Edition' 00:00:26.760 ++ VARIANT_ID=cloud 00:00:26.760 + uname -a 00:00:26.760 Linux spdk-gp-12 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:00:26.760 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:00:27.696 Hugepages 00:00:27.696 node hugesize free / total 00:00:27.696 node0 1048576kB 0 / 0 00:00:27.696 node0 2048kB 0 / 0 00:00:27.696 node1 1048576kB 0 / 0 00:00:27.696 node1 2048kB 0 / 0 00:00:27.696 00:00:27.696 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:27.953 I/OAT 0000:00:04.0 8086 0e20 0 ioatdma - - 00:00:27.953 I/OAT 0000:00:04.1 8086 0e21 0 ioatdma - - 00:00:27.953 I/OAT 0000:00:04.2 8086 0e22 0 ioatdma - - 00:00:27.953 I/OAT 0000:00:04.3 8086 0e23 0 ioatdma - - 00:00:27.953 I/OAT 0000:00:04.4 8086 0e24 0 ioatdma - - 00:00:27.953 I/OAT 0000:00:04.5 8086 0e25 0 ioatdma - - 00:00:27.953 I/OAT 0000:00:04.6 8086 0e26 0 ioatdma - - 00:00:27.953 I/OAT 0000:00:04.7 8086 0e27 0 ioatdma - - 00:00:27.953 I/OAT 0000:80:04.0 8086 0e20 1 ioatdma - - 00:00:27.953 I/OAT 0000:80:04.1 8086 0e21 1 ioatdma - - 00:00:27.953 I/OAT 0000:80:04.2 8086 0e22 1 ioatdma - - 00:00:27.953 I/OAT 0000:80:04.3 8086 0e23 1 ioatdma - - 00:00:27.953 I/OAT 0000:80:04.4 8086 0e24 1 ioatdma - - 00:00:27.953 I/OAT 0000:80:04.5 8086 0e25 1 ioatdma - - 00:00:27.953 I/OAT 0000:80:04.6 8086 0e26 1 ioatdma - - 00:00:27.953 I/OAT 0000:80:04.7 8086 0e27 1 ioatdma - - 00:00:27.953 NVMe 0000:81:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:00:27.953 + rm -f 
/tmp/spdk-ld-path 00:00:27.953 + source autorun-spdk.conf 00:00:27.953 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:27.953 ++ SPDK_TEST_BLOCKDEV=1 00:00:27.953 ++ SPDK_TEST_ISAL=1 00:00:27.953 ++ SPDK_TEST_CRYPTO=1 00:00:27.953 ++ SPDK_TEST_REDUCE=1 00:00:27.953 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:00:27.953 ++ SPDK_RUN_UBSAN=1 00:00:27.953 ++ RUN_NIGHTLY=0 00:00:27.953 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:27.953 + [[ -n '' ]] 00:00:27.954 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:27.954 + for M in /var/spdk/build-*-manifest.txt 00:00:27.954 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:27.954 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:27.954 + for M in /var/spdk/build-*-manifest.txt 00:00:27.954 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:27.954 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:00:27.954 ++ uname 00:00:27.954 + [[ Linux == \L\i\n\u\x ]] 00:00:27.954 + sudo dmesg -T 00:00:27.954 + sudo dmesg --clear 00:00:27.954 + dmesg_pid=3692866 00:00:27.954 + [[ Fedora Linux == FreeBSD ]] 00:00:27.954 + sudo dmesg -Tw 00:00:27.954 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:27.954 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:27.954 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:27.954 + [[ -x /usr/src/fio-static/fio ]] 00:00:27.954 + export FIO_BIN=/usr/src/fio-static/fio 00:00:27.954 + FIO_BIN=/usr/src/fio-static/fio 00:00:27.954 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:27.954 + [[ ! -v VFIO_QEMU_BIN ]] 00:00:27.954 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:27.954 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:27.954 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:27.954 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:27.954 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:27.954 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:27.954 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:00:27.954 Test configuration: 00:00:27.954 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:27.954 SPDK_TEST_BLOCKDEV=1 00:00:27.954 SPDK_TEST_ISAL=1 00:00:27.954 SPDK_TEST_CRYPTO=1 00:00:27.954 SPDK_TEST_REDUCE=1 00:00:27.954 SPDK_TEST_VBDEV_COMPRESS=1 00:00:27.954 SPDK_RUN_UBSAN=1 00:00:27.954 RUN_NIGHTLY=0 04:01:15 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:00:27.954 04:01:15 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:27.954 04:01:15 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:27.954 04:01:15 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:27.954 04:01:15 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:27.954 04:01:15 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:27.954 04:01:15 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:27.954 04:01:15 -- paths/export.sh@5 -- $ export PATH 00:00:27.954 04:01:15 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:27.954 04:01:15 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:00:27.954 04:01:15 -- common/autobuild_common.sh@437 -- $ date +%s 00:00:27.954 04:01:15 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1715738475.XXXXXX 00:00:27.954 04:01:15 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1715738475.nWMKJJ 00:00:27.954 04:01:15 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:00:27.954 04:01:15 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']' 00:00:27.954 04:01:15 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:00:27.954 04:01:15 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:27.954 04:01:15 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:27.954 04:01:15 -- common/autobuild_common.sh@453 -- $ get_config_params 00:00:27.954 04:01:15 -- common/autotest_common.sh@395 -- $ xtrace_disable 00:00:27.954 04:01:15 -- common/autotest_common.sh@10 -- $ set +x 00:00:28.212 04:01:15 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:00:28.212 04:01:15 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:00:28.212 04:01:15 -- pm/common@17 -- $ local monitor 00:00:28.212 04:01:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:28.212 04:01:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:28.212 04:01:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:28.212 04:01:15 -- pm/common@21 -- $ date +%s 00:00:28.212 04:01:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:28.212 04:01:15 -- 
pm/common@21 -- $ date +%s 00:00:28.212 04:01:15 -- pm/common@25 -- $ sleep 1 00:00:28.212 04:01:15 -- pm/common@21 -- $ date +%s 00:00:28.212 04:01:15 -- pm/common@21 -- $ date +%s 00:00:28.212 04:01:15 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715738475 00:00:28.212 04:01:15 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715738475 00:00:28.212 04:01:15 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715738475 00:00:28.212 04:01:15 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1715738475 00:00:28.212 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715738475_collect-vmstat.pm.log 00:00:28.212 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715738475_collect-cpu-load.pm.log 00:00:28.212 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715738475_collect-cpu-temp.pm.log 00:00:28.212 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1715738475_collect-bmc-pm.bmc.pm.log 00:00:29.144 04:01:16 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:00:29.144 04:01:16 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:00:29.144 04:01:16 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:29.144 04:01:16 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:29.144 04:01:16 -- spdk/autobuild.sh@16 -- $ date -u 00:00:29.144 Wed May 15 02:01:16 AM UTC 2024 00:00:29.144 04:01:16 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:29.144 v24.05-pre-653-g2dc74a001 00:00:29.144 04:01:16 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:29.144 04:01:16 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:29.144 04:01:16 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:29.144 04:01:16 -- common/autotest_common.sh@1097 -- $ '[' 3 -le 1 ']' 00:00:29.144 04:01:16 -- common/autotest_common.sh@1103 -- $ xtrace_disable 00:00:29.144 04:01:16 -- common/autotest_common.sh@10 -- $ set +x 00:00:29.144 ************************************ 00:00:29.144 START TEST ubsan 00:00:29.144 ************************************ 00:00:29.144 04:01:17 ubsan -- common/autotest_common.sh@1121 -- $ echo 'using ubsan' 00:00:29.144 using ubsan 00:00:29.144 00:00:29.144 real 0m0.000s 00:00:29.144 user 0m0.000s 00:00:29.144 sys 0m0.000s 00:00:29.144 04:01:17 ubsan -- common/autotest_common.sh@1122 -- $ xtrace_disable 00:00:29.144 04:01:17 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:00:29.144 ************************************ 00:00:29.144 END TEST ubsan 00:00:29.144 ************************************ 00:00:29.144 04:01:17 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:29.144 04:01:17 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:29.144 04:01:17 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:29.144 04:01:17 -- spdk/autobuild.sh@51 -- $ 
[[ 0 -eq 1 ]] 00:00:29.144 04:01:17 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:00:29.144 04:01:17 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:00:29.144 04:01:17 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:00:29.144 04:01:17 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:00:29.144 04:01:17 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:00:29.144 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:00:29.144 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:00:29.401 Using 'verbs' RDMA provider 00:00:40.317 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:00:50.289 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:00:50.289 Creating mk/config.mk...done. 00:00:50.289 Creating mk/cc.flags.mk...done. 00:00:50.289 Type 'make' to build. 00:00:50.289 04:01:37 -- spdk/autobuild.sh@69 -- $ run_test make make -j48 00:00:50.289 04:01:37 -- common/autotest_common.sh@1097 -- $ '[' 3 -le 1 ']' 00:00:50.289 04:01:37 -- common/autotest_common.sh@1103 -- $ xtrace_disable 00:00:50.289 04:01:37 -- common/autotest_common.sh@10 -- $ set +x 00:00:50.289 ************************************ 00:00:50.289 START TEST make 00:00:50.289 ************************************ 00:00:50.289 04:01:37 make -- common/autotest_common.sh@1121 -- $ make -j48 00:00:50.289 make[1]: Nothing to be done for 'all'. 00:01:37.022 The Meson build system 00:01:37.022 Version: 1.3.1 00:01:37.022 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:01:37.022 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:01:37.022 Build type: native build 00:01:37.022 Program cat found: YES (/usr/bin/cat) 00:01:37.022 Project name: DPDK 00:01:37.022 Project version: 23.11.0 00:01:37.022 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:01:37.023 C linker for the host machine: cc ld.bfd 2.39-16 00:01:37.023 Host machine cpu family: x86_64 00:01:37.023 Host machine cpu: x86_64 00:01:37.023 Message: ## Building in Developer Mode ## 00:01:37.023 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:37.023 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:37.023 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:01:37.023 Program python3 found: YES (/usr/bin/python3) 00:01:37.023 Program cat found: YES (/usr/bin/cat) 00:01:37.023 Compiler for C supports arguments -march=native: YES 00:01:37.023 Checking for size of "void *" : 8 00:01:37.023 Checking for size of "void *" : 8 (cached) 00:01:37.023 Library m found: YES 00:01:37.023 Library numa found: YES 00:01:37.023 Has header "numaif.h" : YES 00:01:37.023 Library fdt found: NO 00:01:37.023 Library execinfo found: NO 00:01:37.023 Has header "execinfo.h" : YES 00:01:37.023 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:37.023 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:37.023 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:37.023 Run-time 
dependency jansson found: NO (tried pkgconfig) 00:01:37.023 Run-time dependency openssl found: YES 3.0.9 00:01:37.023 Run-time dependency libpcap found: YES 1.10.4 00:01:37.023 Has header "pcap.h" with dependency libpcap: YES 00:01:37.023 Compiler for C supports arguments -Wcast-qual: YES 00:01:37.023 Compiler for C supports arguments -Wdeprecated: YES 00:01:37.023 Compiler for C supports arguments -Wformat: YES 00:01:37.023 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:37.023 Compiler for C supports arguments -Wformat-security: NO 00:01:37.023 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:37.023 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:37.023 Compiler for C supports arguments -Wnested-externs: YES 00:01:37.023 Compiler for C supports arguments -Wold-style-definition: YES 00:01:37.023 Compiler for C supports arguments -Wpointer-arith: YES 00:01:37.023 Compiler for C supports arguments -Wsign-compare: YES 00:01:37.023 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:37.023 Compiler for C supports arguments -Wundef: YES 00:01:37.023 Compiler for C supports arguments -Wwrite-strings: YES 00:01:37.023 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:37.023 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:37.023 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:37.023 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:37.023 Program objdump found: YES (/usr/bin/objdump) 00:01:37.023 Compiler for C supports arguments -mavx512f: YES 00:01:37.023 Checking if "AVX512 checking" compiles: YES 00:01:37.023 Fetching value of define "__SSE4_2__" : 1 00:01:37.023 Fetching value of define "__AES__" : 1 00:01:37.023 Fetching value of define "__AVX__" : 1 00:01:37.023 Fetching value of define "__AVX2__" : (undefined) 00:01:37.023 Fetching value of define "__AVX512BW__" : (undefined) 00:01:37.023 Fetching value of define "__AVX512CD__" : (undefined) 00:01:37.023 Fetching value of define "__AVX512DQ__" : (undefined) 00:01:37.023 Fetching value of define "__AVX512F__" : (undefined) 00:01:37.023 Fetching value of define "__AVX512VL__" : (undefined) 00:01:37.023 Fetching value of define "__PCLMUL__" : 1 00:01:37.023 Fetching value of define "__RDRND__" : 1 00:01:37.023 Fetching value of define "__RDSEED__" : (undefined) 00:01:37.023 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:37.023 Fetching value of define "__znver1__" : (undefined) 00:01:37.023 Fetching value of define "__znver2__" : (undefined) 00:01:37.023 Fetching value of define "__znver3__" : (undefined) 00:01:37.023 Fetching value of define "__znver4__" : (undefined) 00:01:37.023 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:37.023 Message: lib/log: Defining dependency "log" 00:01:37.023 Message: lib/kvargs: Defining dependency "kvargs" 00:01:37.023 Message: lib/telemetry: Defining dependency "telemetry" 00:01:37.023 Checking for function "getentropy" : NO 00:01:37.023 Message: lib/eal: Defining dependency "eal" 00:01:37.023 Message: lib/ring: Defining dependency "ring" 00:01:37.023 Message: lib/rcu: Defining dependency "rcu" 00:01:37.023 Message: lib/mempool: Defining dependency "mempool" 00:01:37.023 Message: lib/mbuf: Defining dependency "mbuf" 00:01:37.023 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:37.023 Fetching value of define "__AVX512F__" : (undefined) (cached) 00:01:37.023 Compiler for C supports arguments -mpclmul: YES 
00:01:37.023 Compiler for C supports arguments -maes: YES 00:01:37.023 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:37.023 Compiler for C supports arguments -mavx512bw: YES 00:01:37.023 Compiler for C supports arguments -mavx512dq: YES 00:01:37.023 Compiler for C supports arguments -mavx512vl: YES 00:01:37.023 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:37.023 Compiler for C supports arguments -mavx2: YES 00:01:37.023 Compiler for C supports arguments -mavx: YES 00:01:37.023 Message: lib/net: Defining dependency "net" 00:01:37.023 Message: lib/meter: Defining dependency "meter" 00:01:37.023 Message: lib/ethdev: Defining dependency "ethdev" 00:01:37.023 Message: lib/pci: Defining dependency "pci" 00:01:37.023 Message: lib/cmdline: Defining dependency "cmdline" 00:01:37.023 Message: lib/hash: Defining dependency "hash" 00:01:37.023 Message: lib/timer: Defining dependency "timer" 00:01:37.023 Message: lib/compressdev: Defining dependency "compressdev" 00:01:37.023 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:37.023 Message: lib/dmadev: Defining dependency "dmadev" 00:01:37.023 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:37.023 Message: lib/power: Defining dependency "power" 00:01:37.023 Message: lib/reorder: Defining dependency "reorder" 00:01:37.023 Message: lib/security: Defining dependency "security" 00:01:37.023 Has header "linux/userfaultfd.h" : YES 00:01:37.023 Has header "linux/vduse.h" : YES 00:01:37.023 Message: lib/vhost: Defining dependency "vhost" 00:01:37.023 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:37.023 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:01:37.023 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:37.023 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:37.023 Compiler for C supports arguments -std=c11: YES 00:01:37.023 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:01:37.023 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:01:37.023 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:01:37.023 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:01:37.023 Run-time dependency libmlx5 found: YES 1.24.44.0 00:01:37.023 Run-time dependency libibverbs found: YES 1.14.44.0 00:01:37.023 Library mtcr_ul found: NO 00:01:37.023 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header 
"infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:01:37.023 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:01:37.023 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseKR4_Full" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseCR4_Full" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseSR4_Full" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/ethtool.h" has symbol "SUPPORTED_40000baseLR4_Full" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseKR4_Full" with dependencies libmlx5, libibverbs: 
YES 00:01:37.024 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseCR4_Full" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseSR4_Full" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/ethtool.h" has symbol "SUPPORTED_56000baseLR4_Full" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_25000baseCR_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_50000baseCR2_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/ethtool.h" has symbol "ETHTOOL_LINK_MODE_100000baseKR4_Full_BIT" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol 
"mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:01:37.024 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:01:37.024 Configuring mlx5_autoconf.h using configuration 00:01:37.024 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:01:37.024 Run-time dependency libcrypto found: YES 3.0.9 00:01:37.024 Library IPSec_MB found: YES 00:01:37.024 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:01:37.024 Message: drivers/common/qat: Defining dependency "common_qat" 00:01:37.024 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:37.024 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:37.024 Library IPSec_MB found: YES 00:01:37.024 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:01:37.024 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:01:37.024 Compiler for C supports arguments -std=c11: YES (cached) 00:01:37.024 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:37.024 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:37.024 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:37.024 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:37.024 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:01:37.024 Run-time dependency libisal found: NO (tried pkgconfig) 00:01:37.024 Library libisal found: NO 00:01:37.024 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:01:37.024 Compiler for C supports arguments -std=c11: YES (cached) 00:01:37.024 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:37.024 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:37.024 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:37.024 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:37.024 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:01:37.024 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:37.024 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:37.024 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:37.024 Message: Disabling baseband/* drivers: 
missing internal dependency "bbdev" 00:01:37.024 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:37.024 Program doxygen found: YES (/usr/bin/doxygen) 00:01:37.024 Configuring doxy-api-html.conf using configuration 00:01:37.024 Configuring doxy-api-man.conf using configuration 00:01:37.024 Program mandb found: YES (/usr/bin/mandb) 00:01:37.024 Program sphinx-build found: NO 00:01:37.024 Configuring rte_build_config.h using configuration 00:01:37.024 Message: 00:01:37.024 ================= 00:01:37.024 Applications Enabled 00:01:37.024 ================= 00:01:37.024 00:01:37.024 apps: 00:01:37.024 00:01:37.024 00:01:37.024 Message: 00:01:37.024 ================= 00:01:37.024 Libraries Enabled 00:01:37.024 ================= 00:01:37.024 00:01:37.024 libs: 00:01:37.024 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:37.024 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:37.024 cryptodev, dmadev, power, reorder, security, vhost, 00:01:37.024 00:01:37.024 Message: 00:01:37.024 =============== 00:01:37.024 Drivers Enabled 00:01:37.024 =============== 00:01:37.024 00:01:37.024 common: 00:01:37.024 mlx5, qat, 00:01:37.024 bus: 00:01:37.024 auxiliary, pci, vdev, 00:01:37.024 mempool: 00:01:37.024 ring, 00:01:37.024 dma: 00:01:37.024 00:01:37.024 net: 00:01:37.024 00:01:37.024 crypto: 00:01:37.024 ipsec_mb, mlx5, 00:01:37.024 compress: 00:01:37.024 isal, mlx5, 00:01:37.024 vdpa: 00:01:37.024 00:01:37.024 00:01:37.024 Message: 00:01:37.024 ================= 00:01:37.024 Content Skipped 00:01:37.024 ================= 00:01:37.024 00:01:37.024 apps: 00:01:37.024 dumpcap: explicitly disabled via build config 00:01:37.024 graph: explicitly disabled via build config 00:01:37.024 pdump: explicitly disabled via build config 00:01:37.024 proc-info: explicitly disabled via build config 00:01:37.024 test-acl: explicitly disabled via build config 00:01:37.024 test-bbdev: explicitly disabled via build config 00:01:37.024 test-cmdline: explicitly disabled via build config 00:01:37.024 test-compress-perf: explicitly disabled via build config 00:01:37.024 test-crypto-perf: explicitly disabled via build config 00:01:37.024 test-dma-perf: explicitly disabled via build config 00:01:37.024 test-eventdev: explicitly disabled via build config 00:01:37.024 test-fib: explicitly disabled via build config 00:01:37.024 test-flow-perf: explicitly disabled via build config 00:01:37.024 test-gpudev: explicitly disabled via build config 00:01:37.024 test-mldev: explicitly disabled via build config 00:01:37.024 test-pipeline: explicitly disabled via build config 00:01:37.024 test-pmd: explicitly disabled via build config 00:01:37.024 test-regex: explicitly disabled via build config 00:01:37.024 test-sad: explicitly disabled via build config 00:01:37.024 test-security-perf: explicitly disabled via build config 00:01:37.024 00:01:37.024 libs: 00:01:37.024 metrics: explicitly disabled via build config 00:01:37.024 acl: explicitly disabled via build config 00:01:37.024 bbdev: explicitly disabled via build config 00:01:37.024 bitratestats: explicitly disabled via build config 00:01:37.024 bpf: explicitly disabled via build config 00:01:37.024 cfgfile: explicitly disabled via build config 00:01:37.024 distributor: explicitly disabled via build config 00:01:37.024 efd: explicitly disabled via build config 00:01:37.024 eventdev: explicitly disabled via build config 00:01:37.024 dispatcher: explicitly disabled via build config 00:01:37.024 gpudev: explicitly disabled via build 
config 00:01:37.024 gro: explicitly disabled via build config 00:01:37.024 gso: explicitly disabled via build config 00:01:37.025 ip_frag: explicitly disabled via build config 00:01:37.025 jobstats: explicitly disabled via build config 00:01:37.025 latencystats: explicitly disabled via build config 00:01:37.025 lpm: explicitly disabled via build config 00:01:37.025 member: explicitly disabled via build config 00:01:37.025 pcapng: explicitly disabled via build config 00:01:37.025 rawdev: explicitly disabled via build config 00:01:37.025 regexdev: explicitly disabled via build config 00:01:37.025 mldev: explicitly disabled via build config 00:01:37.025 rib: explicitly disabled via build config 00:01:37.025 sched: explicitly disabled via build config 00:01:37.025 stack: explicitly disabled via build config 00:01:37.025 ipsec: explicitly disabled via build config 00:01:37.025 pdcp: explicitly disabled via build config 00:01:37.025 fib: explicitly disabled via build config 00:01:37.025 port: explicitly disabled via build config 00:01:37.025 pdump: explicitly disabled via build config 00:01:37.025 table: explicitly disabled via build config 00:01:37.025 pipeline: explicitly disabled via build config 00:01:37.025 graph: explicitly disabled via build config 00:01:37.025 node: explicitly disabled via build config 00:01:37.025 00:01:37.025 drivers: 00:01:37.025 common/cpt: not in enabled drivers build config 00:01:37.025 common/dpaax: not in enabled drivers build config 00:01:37.025 common/iavf: not in enabled drivers build config 00:01:37.025 common/idpf: not in enabled drivers build config 00:01:37.025 common/mvep: not in enabled drivers build config 00:01:37.025 common/octeontx: not in enabled drivers build config 00:01:37.025 bus/cdx: not in enabled drivers build config 00:01:37.025 bus/dpaa: not in enabled drivers build config 00:01:37.025 bus/fslmc: not in enabled drivers build config 00:01:37.025 bus/ifpga: not in enabled drivers build config 00:01:37.025 bus/platform: not in enabled drivers build config 00:01:37.025 bus/vmbus: not in enabled drivers build config 00:01:37.025 common/cnxk: not in enabled drivers build config 00:01:37.025 common/nfp: not in enabled drivers build config 00:01:37.025 common/sfc_efx: not in enabled drivers build config 00:01:37.025 mempool/bucket: not in enabled drivers build config 00:01:37.025 mempool/cnxk: not in enabled drivers build config 00:01:37.025 mempool/dpaa: not in enabled drivers build config 00:01:37.025 mempool/dpaa2: not in enabled drivers build config 00:01:37.025 mempool/octeontx: not in enabled drivers build config 00:01:37.025 mempool/stack: not in enabled drivers build config 00:01:37.025 dma/cnxk: not in enabled drivers build config 00:01:37.025 dma/dpaa: not in enabled drivers build config 00:01:37.025 dma/dpaa2: not in enabled drivers build config 00:01:37.025 dma/hisilicon: not in enabled drivers build config 00:01:37.025 dma/idxd: not in enabled drivers build config 00:01:37.025 dma/ioat: not in enabled drivers build config 00:01:37.025 dma/skeleton: not in enabled drivers build config 00:01:37.025 net/af_packet: not in enabled drivers build config 00:01:37.025 net/af_xdp: not in enabled drivers build config 00:01:37.025 net/ark: not in enabled drivers build config 00:01:37.025 net/atlantic: not in enabled drivers build config 00:01:37.025 net/avp: not in enabled drivers build config 00:01:37.025 net/axgbe: not in enabled drivers build config 00:01:37.025 net/bnx2x: not in enabled drivers build config 00:01:37.025 net/bnxt: not in 
enabled drivers build config 00:01:37.025 net/bonding: not in enabled drivers build config 00:01:37.025 net/cnxk: not in enabled drivers build config 00:01:37.025 net/cpfl: not in enabled drivers build config 00:01:37.025 net/cxgbe: not in enabled drivers build config 00:01:37.025 net/dpaa: not in enabled drivers build config 00:01:37.025 net/dpaa2: not in enabled drivers build config 00:01:37.025 net/e1000: not in enabled drivers build config 00:01:37.025 net/ena: not in enabled drivers build config 00:01:37.025 net/enetc: not in enabled drivers build config 00:01:37.025 net/enetfec: not in enabled drivers build config 00:01:37.025 net/enic: not in enabled drivers build config 00:01:37.025 net/failsafe: not in enabled drivers build config 00:01:37.025 net/fm10k: not in enabled drivers build config 00:01:37.025 net/gve: not in enabled drivers build config 00:01:37.025 net/hinic: not in enabled drivers build config 00:01:37.025 net/hns3: not in enabled drivers build config 00:01:37.025 net/i40e: not in enabled drivers build config 00:01:37.025 net/iavf: not in enabled drivers build config 00:01:37.025 net/ice: not in enabled drivers build config 00:01:37.025 net/idpf: not in enabled drivers build config 00:01:37.025 net/igc: not in enabled drivers build config 00:01:37.025 net/ionic: not in enabled drivers build config 00:01:37.025 net/ipn3ke: not in enabled drivers build config 00:01:37.025 net/ixgbe: not in enabled drivers build config 00:01:37.025 net/mana: not in enabled drivers build config 00:01:37.025 net/memif: not in enabled drivers build config 00:01:37.025 net/mlx4: not in enabled drivers build config 00:01:37.025 net/mlx5: not in enabled drivers build config 00:01:37.025 net/mvneta: not in enabled drivers build config 00:01:37.025 net/mvpp2: not in enabled drivers build config 00:01:37.025 net/netvsc: not in enabled drivers build config 00:01:37.025 net/nfb: not in enabled drivers build config 00:01:37.025 net/nfp: not in enabled drivers build config 00:01:37.025 net/ngbe: not in enabled drivers build config 00:01:37.025 net/null: not in enabled drivers build config 00:01:37.025 net/octeontx: not in enabled drivers build config 00:01:37.025 net/octeon_ep: not in enabled drivers build config 00:01:37.025 net/pcap: not in enabled drivers build config 00:01:37.025 net/pfe: not in enabled drivers build config 00:01:37.025 net/qede: not in enabled drivers build config 00:01:37.025 net/ring: not in enabled drivers build config 00:01:37.025 net/sfc: not in enabled drivers build config 00:01:37.025 net/softnic: not in enabled drivers build config 00:01:37.025 net/tap: not in enabled drivers build config 00:01:37.025 net/thunderx: not in enabled drivers build config 00:01:37.025 net/txgbe: not in enabled drivers build config 00:01:37.025 net/vdev_netvsc: not in enabled drivers build config 00:01:37.025 net/vhost: not in enabled drivers build config 00:01:37.025 net/virtio: not in enabled drivers build config 00:01:37.025 net/vmxnet3: not in enabled drivers build config 00:01:37.025 raw/*: missing internal dependency, "rawdev" 00:01:37.025 crypto/armv8: not in enabled drivers build config 00:01:37.025 crypto/bcmfs: not in enabled drivers build config 00:01:37.025 crypto/caam_jr: not in enabled drivers build config 00:01:37.025 crypto/ccp: not in enabled drivers build config 00:01:37.025 crypto/cnxk: not in enabled drivers build config 00:01:37.025 crypto/dpaa_sec: not in enabled drivers build config 00:01:37.025 crypto/dpaa2_sec: not in enabled drivers build config 00:01:37.025 
crypto/mvsam: not in enabled drivers build config 00:01:37.025 crypto/nitrox: not in enabled drivers build config 00:01:37.025 crypto/null: not in enabled drivers build config 00:01:37.025 crypto/octeontx: not in enabled drivers build config 00:01:37.025 crypto/openssl: not in enabled drivers build config 00:01:37.025 crypto/scheduler: not in enabled drivers build config 00:01:37.025 crypto/uadk: not in enabled drivers build config 00:01:37.025 crypto/virtio: not in enabled drivers build config 00:01:37.025 compress/octeontx: not in enabled drivers build config 00:01:37.025 compress/zlib: not in enabled drivers build config 00:01:37.025 regex/*: missing internal dependency, "regexdev" 00:01:37.025 ml/*: missing internal dependency, "mldev" 00:01:37.025 vdpa/ifc: not in enabled drivers build config 00:01:37.025 vdpa/mlx5: not in enabled drivers build config 00:01:37.025 vdpa/nfp: not in enabled drivers build config 00:01:37.025 vdpa/sfc: not in enabled drivers build config 00:01:37.025 event/*: missing internal dependency, "eventdev" 00:01:37.025 baseband/*: missing internal dependency, "bbdev" 00:01:37.025 gpu/*: missing internal dependency, "gpudev" 00:01:37.025 00:01:37.025 00:01:37.025 Build targets in project: 115 00:01:37.025 00:01:37.025 DPDK 23.11.0 00:01:37.025 00:01:37.025 User defined options 00:01:37.025 buildtype : debug 00:01:37.025 default_library : shared 00:01:37.025 libdir : lib 00:01:37.025 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:37.025 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:01:37.025 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:01:37.025 cpu_instruction_set: native 00:01:37.025 disable_apps : test-sad,graph,test-regex,dumpcap,test-eventdev,test-compress-perf,pdump,test-security-perf,test-pmd,test-flow-perf,test-pipeline,test-crypto-perf,test-gpudev,test-cmdline,test-dma-perf,proc-info,test-bbdev,test-acl,test,test-mldev,test-fib 00:01:37.025 disable_libs : sched,port,dispatcher,graph,rawdev,pdcp,bitratestats,ipsec,pcapng,pdump,gso,cfgfile,gpudev,ip_frag,node,distributor,mldev,lpm,acl,bpf,latencystats,eventdev,regexdev,gro,stack,fib,pipeline,bbdev,table,metrics,member,jobstats,efd,rib 00:01:37.025 enable_docs : false 00:01:37.025 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:01:37.025 enable_kmods : false 00:01:37.025 tests : false 00:01:37.025 00:01:37.025 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:37.025 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:01:37.025 [1/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:37.025 [2/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:37.025 [3/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:37.025 [4/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:37.025 [5/370] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:37.025 [6/370] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:37.025 [7/370] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:37.025 [8/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:37.025 [9/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:37.025 [10/370] Linking static target lib/librte_kvargs.a 00:01:37.025 [11/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:37.026 [12/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:37.026 [13/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:37.026 [14/370] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:37.026 [15/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:37.026 [16/370] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:37.026 [17/370] Linking static target lib/librte_log.a 00:01:37.026 [18/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:37.026 [19/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:37.026 [20/370] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:37.026 [21/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:37.026 [22/370] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.026 [23/370] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.026 [24/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:37.026 [25/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:37.026 [26/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:37.026 [27/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:37.026 [28/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:37.026 [29/370] Linking target lib/librte_log.so.24.0 00:01:37.026 [30/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:37.026 [31/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:37.026 [32/370] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:37.026 [33/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:37.026 [34/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:37.026 [35/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:37.026 [36/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:37.026 [37/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:37.026 [38/370] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:37.026 [39/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:37.026 [40/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:37.026 [41/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:37.026 [42/370] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:37.026 [43/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:37.026 [44/370] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:37.026 [45/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:37.026 [46/370] Compiling C object 
lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:37.026 [47/370] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:37.026 [48/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:37.026 [49/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:37.026 [50/370] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:37.026 [51/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:37.026 [52/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:37.026 [53/370] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:37.026 [54/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:37.026 [55/370] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:37.026 [56/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:37.026 [57/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:37.026 [58/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:37.026 [59/370] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:37.026 [60/370] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:37.026 [61/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:37.026 [62/370] Linking static target lib/librte_telemetry.a 00:01:37.026 [63/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:37.026 [64/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:37.026 [65/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:37.026 [66/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:37.026 [67/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:37.026 [68/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:37.026 [69/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:37.026 [70/370] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:37.026 [71/370] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:37.026 [72/370] Linking static target lib/librte_pci.a 00:01:37.026 [73/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:37.026 [74/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:37.026 [75/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:37.026 [76/370] Linking target lib/librte_kvargs.so.24.0 00:01:37.026 [77/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:37.026 [78/370] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:37.026 [79/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:37.026 [80/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:37.026 [81/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:37.026 [82/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:37.026 [83/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:37.026 [84/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:37.286 [85/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:37.286 [86/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:37.286 
[87/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:37.286 [88/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:37.286 [89/370] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:37.286 [90/370] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.286 [91/370] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:37.286 [92/370] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:37.545 [93/370] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.545 [94/370] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:37.545 [95/370] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:37.545 [96/370] Linking target lib/librte_telemetry.so.24.0 00:01:37.545 [97/370] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:37.545 [98/370] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:37.545 [99/370] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:37.545 [100/370] Linking static target lib/librte_eal.a 00:01:37.545 [101/370] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:37.545 [102/370] Linking static target lib/librte_ring.a 00:01:37.545 [103/370] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:37.812 [104/370] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:37.812 [105/370] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:37.812 [106/370] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:37.812 [107/370] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:37.812 [108/370] Linking static target lib/librte_meter.a 00:01:37.812 [109/370] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:37.812 [110/370] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:37.812 [111/370] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:37.812 [112/370] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:37.812 [113/370] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:37.812 [114/370] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:37.812 [115/370] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:37.812 [116/370] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:37.812 [117/370] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:37.812 [118/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:37.812 [119/370] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:37.812 [120/370] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:37.812 [121/370] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:37.812 [122/370] Linking static target lib/librte_mempool.a 00:01:37.812 [123/370] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:37.812 [124/370] Linking static target lib/librte_rcu.a 00:01:37.812 [125/370] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:38.077 [126/370] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:38.077 [127/370] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:38.077 [128/370] Compiling C object 
lib/librte_net.a.p/net_rte_ether.c.o 00:01:38.077 [129/370] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:38.077 [130/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:38.077 [131/370] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:38.077 [132/370] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:38.077 [133/370] Linking static target lib/librte_cmdline.a 00:01:38.077 [134/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:38.077 [135/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:38.077 [136/370] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:38.077 [137/370] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:38.077 [138/370] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.077 [139/370] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.077 [140/370] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:01:38.077 [141/370] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:38.077 [142/370] Linking static target lib/librte_timer.a 00:01:38.077 [143/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:38.077 [144/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:38.341 [145/370] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:38.341 [146/370] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:38.341 [147/370] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:38.341 [148/370] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:38.341 [149/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:38.341 [150/370] Linking static target lib/librte_net.a 00:01:38.341 [151/370] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:38.341 [152/370] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.341 [153/370] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:38.612 [154/370] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:38.612 [155/370] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:38.612 [156/370] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:38.612 [157/370] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:38.612 [158/370] Linking static target lib/librte_dmadev.a 00:01:38.612 [159/370] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:38.613 [160/370] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:38.613 [161/370] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.613 [162/370] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.895 [163/370] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:38.895 [164/370] Linking static target lib/librte_compressdev.a 00:01:38.895 [165/370] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:38.895 [166/370] Linking static target lib/librte_hash.a 00:01:38.895 [167/370] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 
00:01:38.895 [168/370] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:38.895 [169/370] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.895 [170/370] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:38.895 [171/370] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:38.895 [172/370] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:01:38.895 [173/370] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:38.895 [174/370] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:38.895 [175/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:01:38.895 [176/370] Linking static target lib/librte_power.a 00:01:38.895 [177/370] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:38.895 [178/370] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:01:38.895 [179/370] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:38.895 [180/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:39.176 [181/370] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:01:39.176 [182/370] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:39.176 [183/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:01:39.176 [184/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:39.176 [185/370] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:39.176 [186/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:39.176 [187/370] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:39.176 [188/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:39.176 [189/370] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:39.176 [190/370] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:39.176 [191/370] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.176 [192/370] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.176 [193/370] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:39.449 [194/370] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:39.449 [195/370] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:39.449 [196/370] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:01:39.449 [197/370] Linking static target lib/librte_reorder.a 00:01:39.449 [198/370] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:39.449 [199/370] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:39.449 [200/370] Linking static target drivers/librte_bus_auxiliary.a 00:01:39.449 [201/370] Compiling C object drivers/librte_bus_auxiliary.so.24.0.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:39.449 [202/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:01:39.449 [203/370] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:39.449 [204/370] Linking static target lib/librte_mbuf.a 00:01:39.449 [205/370] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:39.449 [206/370] Compiling C object 
drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:39.449 [207/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:01:39.449 [208/370] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.449 [209/370] Linking static target drivers/librte_bus_vdev.a 00:01:39.715 [210/370] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:39.715 [211/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:01:39.715 [212/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:01:39.715 [213/370] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:39.715 [214/370] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:39.715 [215/370] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.715 [216/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:01:39.715 [217/370] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:39.715 [218/370] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:39.716 [219/370] Linking static target drivers/librte_bus_pci.a 00:01:39.716 [220/370] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:39.716 [221/370] Linking static target lib/librte_security.a 00:01:39.716 [222/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:01:39.716 [223/370] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.974 [224/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:01:39.974 [225/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:01:39.974 [226/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:01:39.974 [227/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:01:39.974 [228/370] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.974 [229/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:01:39.974 [230/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:01:39.974 [231/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:01:39.974 [232/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:01:39.974 [233/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:01:39.974 [234/370] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.974 [235/370] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.974 [236/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:01:39.974 [237/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:01:39.974 [238/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:39.974 [239/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:01:40.234 [240/370] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:01:40.234 [241/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:01:40.234 [242/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:01:40.234 [243/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:01:40.234 [244/370] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.234 [245/370] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.234 [246/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:01:40.234 [247/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:01:40.234 [248/370] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:40.234 [249/370] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:40.234 [250/370] Linking static target lib/librte_ethdev.a 00:01:40.234 [251/370] Linking static target lib/librte_cryptodev.a 00:01:40.234 [252/370] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:40.493 [253/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:01:40.493 [254/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:01:40.493 [255/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:01:40.493 [256/370] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:40.493 [257/370] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:40.493 [258/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:01:40.752 [259/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:01:40.752 [260/370] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:01:40.752 [261/370] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:01:40.752 [262/370] Linking static target drivers/libtmp_rte_common_mlx5.a 00:01:40.752 [263/370] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:40.752 [264/370] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:40.752 [265/370] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:40.752 [266/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:01:40.752 [267/370] Linking static target drivers/librte_mempool_ring.a 00:01:40.752 [268/370] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:01:41.011 [269/370] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:01:41.011 [270/370] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:01:41.011 [271/370] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:01:41.011 [272/370] Compiling C object drivers/librte_common_mlx5.so.24.0.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:01:41.011 [273/370] Linking static target drivers/librte_common_mlx5.a 00:01:41.011 [274/370] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 
00:01:41.011 [275/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:01:41.011 [276/370] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:01:41.011 [277/370] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:01:41.269 [278/370] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:01:41.269 [279/370] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:01:41.269 [280/370] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:01:41.269 [281/370] Linking static target drivers/libtmp_rte_compress_isal.a 00:01:41.269 [282/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:01:41.269 [283/370] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:01:41.269 [284/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:01:41.269 [285/370] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:01:41.269 [286/370] Compiling C object drivers/librte_crypto_mlx5.so.24.0.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:01:41.269 [287/370] Linking static target drivers/librte_crypto_mlx5.a 00:01:41.269 [288/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:01:41.269 [289/370] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.269 [290/370] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:01:41.528 [291/370] Compiling C object drivers/librte_compress_mlx5.so.24.0.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:41.528 [292/370] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:41.528 [293/370] Linking static target drivers/librte_compress_mlx5.a 00:01:41.528 [294/370] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:01:41.528 [295/370] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:01:41.528 [296/370] Compiling C object drivers/librte_compress_isal.so.24.0.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:01:41.528 [297/370] Linking static target drivers/librte_compress_isal.a 00:01:41.528 [298/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:01:41.786 [299/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:01:41.786 [300/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:01:42.044 [301/370] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:01:42.044 [302/370] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:01:42.044 [303/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:01:42.301 [304/370] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:01:42.301 [305/370] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:01:42.301 [306/370] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.0.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:01:42.301 [307/370] Linking static target drivers/librte_crypto_ipsec_mb.a 00:01:42.559 [308/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:01:42.559 
[309/370] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:42.818 [310/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:01:43.753 [311/370] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.319 [312/370] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.319 [313/370] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.577 [314/370] Linking target lib/librte_eal.so.24.0 00:01:44.577 [315/370] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:01:44.577 [316/370] Linking target lib/librte_meter.so.24.0 00:01:44.577 [317/370] Linking target lib/librte_timer.so.24.0 00:01:44.577 [318/370] Linking target lib/librte_ring.so.24.0 00:01:44.577 [319/370] Linking target lib/librte_dmadev.so.24.0 00:01:44.577 [320/370] Linking target lib/librte_pci.so.24.0 00:01:44.577 [321/370] Linking target drivers/librte_bus_auxiliary.so.24.0 00:01:44.577 [322/370] Linking target drivers/librte_bus_vdev.so.24.0 00:01:44.835 [323/370] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols 00:01:44.835 [324/370] Generating symbol file drivers/librte_bus_auxiliary.so.24.0.p/librte_bus_auxiliary.so.24.0.symbols 00:01:44.835 [325/370] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:01:44.835 [326/370] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:01:44.835 [327/370] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:01:44.835 [328/370] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:01:44.835 [329/370] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:01:44.835 [330/370] Linking target lib/librte_rcu.so.24.0 00:01:44.835 [331/370] Linking target lib/librte_mempool.so.24.0 00:01:44.835 [332/370] Linking target drivers/librte_bus_pci.so.24.0 00:01:44.835 [333/370] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols 00:01:44.835 [334/370] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:01:44.835 [335/370] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:01:45.093 [336/370] Linking target drivers/librte_mempool_ring.so.24.0 00:01:45.093 [337/370] Linking target lib/librte_mbuf.so.24.0 00:01:45.093 [338/370] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:01:45.093 [339/370] Linking target lib/librte_compressdev.so.24.0 00:01:45.093 [340/370] Linking target lib/librte_reorder.so.24.0 00:01:45.093 [341/370] Linking target lib/librte_net.so.24.0 00:01:45.093 [342/370] Linking target lib/librte_cryptodev.so.24.0 00:01:45.350 [343/370] Generating symbol file lib/librte_compressdev.so.24.0.p/librte_compressdev.so.24.0.symbols 00:01:45.350 [344/370] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:01:45.350 [345/370] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:01:45.350 [346/370] Linking target lib/librte_security.so.24.0 00:01:45.350 [347/370] Linking target lib/librte_hash.so.24.0 00:01:45.350 [348/370] Linking target lib/librte_cmdline.so.24.0 00:01:45.350 [349/370] Linking target drivers/librte_compress_isal.so.24.0 00:01:45.350 [350/370] Linking target lib/librte_ethdev.so.24.0 
00:01:45.350 [351/370] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:01:45.350 [352/370] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:01:45.350 [353/370] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:01:45.607 [354/370] Linking target drivers/librte_common_mlx5.so.24.0 00:01:45.607 [355/370] Linking target lib/librte_power.so.24.0 00:01:45.607 [356/370] Generating symbol file drivers/librte_common_mlx5.so.24.0.p/librte_common_mlx5.so.24.0.symbols 00:01:45.607 [357/370] Linking target drivers/librte_compress_mlx5.so.24.0 00:01:45.607 [358/370] Linking target drivers/librte_crypto_mlx5.so.24.0 00:01:45.865 [359/370] Linking target drivers/librte_crypto_ipsec_mb.so.24.0 00:01:48.396 [360/370] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:48.396 [361/370] Linking static target lib/librte_vhost.a 00:01:48.396 [362/370] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:01:48.396 [363/370] Linking static target drivers/libtmp_rte_common_qat.a 00:01:48.657 [364/370] Generating drivers/rte_common_qat.pmd.c with a custom command 00:01:48.657 [365/370] Compiling C object drivers/librte_common_qat.so.24.0.p/meson-generated_.._rte_common_qat.pmd.c.o 00:01:48.657 [366/370] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:01:48.657 [367/370] Linking static target drivers/librte_common_qat.a 00:01:48.915 [368/370] Linking target drivers/librte_common_qat.so.24.0 00:01:49.175 [369/370] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:49.175 [370/370] Linking target lib/librte_vhost.so.24.0 00:01:49.175 INFO: autodetecting backend as ninja 00:01:49.175 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 48 00:01:50.108 CC lib/log/log.o 00:01:50.108 CC lib/log/log_flags.o 00:01:50.108 CC lib/log/log_deprecated.o 00:01:50.108 CC lib/ut/ut.o 00:01:50.108 CC lib/ut_mock/mock.o 00:01:50.365 LIB libspdk_ut_mock.a 00:01:50.365 LIB libspdk_log.a 00:01:50.365 SO libspdk_ut_mock.so.6.0 00:01:50.365 LIB libspdk_ut.a 00:01:50.365 SO libspdk_ut.so.2.0 00:01:50.365 SO libspdk_log.so.7.0 00:01:50.365 SYMLINK libspdk_ut_mock.so 00:01:50.365 SYMLINK libspdk_ut.so 00:01:50.365 SYMLINK libspdk_log.so 00:01:50.623 CC lib/ioat/ioat.o 00:01:50.623 CXX lib/trace_parser/trace.o 00:01:50.623 CC lib/util/base64.o 00:01:50.623 CC lib/dma/dma.o 00:01:50.623 CC lib/util/bit_array.o 00:01:50.623 CC lib/util/cpuset.o 00:01:50.623 CC lib/util/crc16.o 00:01:50.623 CC lib/util/crc32.o 00:01:50.623 CC lib/util/crc32c.o 00:01:50.623 CC lib/util/crc32_ieee.o 00:01:50.623 CC lib/util/crc64.o 00:01:50.623 CC lib/util/dif.o 00:01:50.623 CC lib/util/fd.o 00:01:50.623 CC lib/util/file.o 00:01:50.623 CC lib/util/hexlify.o 00:01:50.623 CC lib/util/iov.o 00:01:50.623 CC lib/util/math.o 00:01:50.623 CC lib/util/pipe.o 00:01:50.623 CC lib/util/strerror_tls.o 00:01:50.623 CC lib/util/string.o 00:01:50.623 CC lib/util/uuid.o 00:01:50.623 CC lib/util/fd_group.o 00:01:50.623 CC lib/util/xor.o 00:01:50.623 CC lib/util/zipf.o 00:01:50.623 CC lib/vfio_user/host/vfio_user_pci.o 00:01:50.623 CC lib/vfio_user/host/vfio_user.o 00:01:50.880 LIB libspdk_dma.a 00:01:50.880 SO libspdk_dma.so.4.0 00:01:50.880 SYMLINK libspdk_dma.so 00:01:50.880 LIB libspdk_ioat.a 00:01:50.880 SO libspdk_ioat.so.7.0 00:01:51.138 SYMLINK libspdk_ioat.so 
00:01:51.138 LIB libspdk_vfio_user.a 00:01:51.138 SO libspdk_vfio_user.so.5.0 00:01:51.138 SYMLINK libspdk_vfio_user.so 00:01:51.138 LIB libspdk_util.a 00:01:51.138 SO libspdk_util.so.9.0 00:01:51.395 SYMLINK libspdk_util.so 00:01:51.653 CC lib/reduce/reduce.o 00:01:51.653 CC lib/vmd/vmd.o 00:01:51.653 CC lib/rdma/common.o 00:01:51.653 CC lib/vmd/led.o 00:01:51.653 CC lib/rdma/rdma_verbs.o 00:01:51.653 CC lib/json/json_parse.o 00:01:51.653 CC lib/idxd/idxd.o 00:01:51.653 CC lib/conf/conf.o 00:01:51.653 CC lib/env_dpdk/env.o 00:01:51.653 CC lib/idxd/idxd_user.o 00:01:51.653 CC lib/json/json_util.o 00:01:51.653 CC lib/env_dpdk/memory.o 00:01:51.653 CC lib/env_dpdk/pci.o 00:01:51.653 CC lib/json/json_write.o 00:01:51.653 CC lib/env_dpdk/init.o 00:01:51.653 CC lib/env_dpdk/threads.o 00:01:51.653 CC lib/env_dpdk/pci_ioat.o 00:01:51.653 CC lib/env_dpdk/pci_virtio.o 00:01:51.653 CC lib/env_dpdk/pci_vmd.o 00:01:51.653 CC lib/env_dpdk/pci_idxd.o 00:01:51.653 CC lib/env_dpdk/pci_event.o 00:01:51.653 CC lib/env_dpdk/sigbus_handler.o 00:01:51.653 CC lib/env_dpdk/pci_dpdk.o 00:01:51.653 CC lib/env_dpdk/pci_dpdk_2207.o 00:01:51.653 CC lib/env_dpdk/pci_dpdk_2211.o 00:01:51.653 LIB libspdk_trace_parser.a 00:01:51.653 SO libspdk_trace_parser.so.5.0 00:01:51.912 SYMLINK libspdk_trace_parser.so 00:01:51.912 LIB libspdk_conf.a 00:01:51.912 SO libspdk_conf.so.6.0 00:01:51.912 LIB libspdk_json.a 00:01:51.912 SYMLINK libspdk_conf.so 00:01:51.912 SO libspdk_json.so.6.0 00:01:51.912 LIB libspdk_rdma.a 00:01:51.912 SYMLINK libspdk_json.so 00:01:51.912 SO libspdk_rdma.so.6.0 00:01:52.171 SYMLINK libspdk_rdma.so 00:01:52.171 LIB libspdk_idxd.a 00:01:52.171 CC lib/jsonrpc/jsonrpc_server.o 00:01:52.171 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:01:52.171 CC lib/jsonrpc/jsonrpc_client.o 00:01:52.171 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:01:52.171 SO libspdk_idxd.so.12.0 00:01:52.171 SYMLINK libspdk_idxd.so 00:01:52.171 LIB libspdk_vmd.a 00:01:52.171 SO libspdk_vmd.so.6.0 00:01:52.429 SYMLINK libspdk_vmd.so 00:01:52.429 LIB libspdk_reduce.a 00:01:52.429 SO libspdk_reduce.so.6.0 00:01:52.429 LIB libspdk_jsonrpc.a 00:01:52.429 SYMLINK libspdk_reduce.so 00:01:52.429 SO libspdk_jsonrpc.so.6.0 00:01:52.429 SYMLINK libspdk_jsonrpc.so 00:01:52.686 CC lib/rpc/rpc.o 00:01:52.944 LIB libspdk_rpc.a 00:01:52.944 SO libspdk_rpc.so.6.0 00:01:52.944 SYMLINK libspdk_rpc.so 00:01:53.202 CC lib/notify/notify.o 00:01:53.202 CC lib/trace/trace.o 00:01:53.202 CC lib/trace/trace_flags.o 00:01:53.202 CC lib/notify/notify_rpc.o 00:01:53.202 CC lib/trace/trace_rpc.o 00:01:53.202 CC lib/keyring/keyring.o 00:01:53.202 CC lib/keyring/keyring_rpc.o 00:01:53.202 LIB libspdk_notify.a 00:01:53.202 SO libspdk_notify.so.6.0 00:01:53.458 LIB libspdk_keyring.a 00:01:53.458 SYMLINK libspdk_notify.so 00:01:53.458 LIB libspdk_trace.a 00:01:53.458 SO libspdk_keyring.so.1.0 00:01:53.458 SO libspdk_trace.so.10.0 00:01:53.458 SYMLINK libspdk_keyring.so 00:01:53.458 SYMLINK libspdk_trace.so 00:01:53.715 CC lib/thread/thread.o 00:01:53.715 CC lib/thread/iobuf.o 00:01:53.715 CC lib/sock/sock.o 00:01:53.715 CC lib/sock/sock_rpc.o 00:01:53.715 LIB libspdk_env_dpdk.a 00:01:53.715 SO libspdk_env_dpdk.so.14.0 00:01:53.972 SYMLINK libspdk_env_dpdk.so 00:01:53.972 LIB libspdk_sock.a 00:01:53.972 SO libspdk_sock.so.9.0 00:01:53.972 SYMLINK libspdk_sock.so 00:01:54.230 CC lib/nvme/nvme_ctrlr_cmd.o 00:01:54.230 CC lib/nvme/nvme_ctrlr.o 00:01:54.230 CC lib/nvme/nvme_fabric.o 00:01:54.230 CC lib/nvme/nvme_ns_cmd.o 00:01:54.230 CC lib/nvme/nvme_ns.o 00:01:54.230 CC 
lib/nvme/nvme_pcie_common.o 00:01:54.230 CC lib/nvme/nvme_pcie.o 00:01:54.230 CC lib/nvme/nvme_qpair.o 00:01:54.230 CC lib/nvme/nvme.o 00:01:54.230 CC lib/nvme/nvme_quirks.o 00:01:54.230 CC lib/nvme/nvme_transport.o 00:01:54.230 CC lib/nvme/nvme_discovery.o 00:01:54.230 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:01:54.230 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:01:54.230 CC lib/nvme/nvme_tcp.o 00:01:54.230 CC lib/nvme/nvme_opal.o 00:01:54.230 CC lib/nvme/nvme_io_msg.o 00:01:54.230 CC lib/nvme/nvme_poll_group.o 00:01:54.230 CC lib/nvme/nvme_zns.o 00:01:54.230 CC lib/nvme/nvme_stubs.o 00:01:54.230 CC lib/nvme/nvme_auth.o 00:01:54.230 CC lib/nvme/nvme_cuse.o 00:01:54.230 CC lib/nvme/nvme_rdma.o 00:01:55.165 LIB libspdk_thread.a 00:01:55.165 SO libspdk_thread.so.10.0 00:01:55.423 SYMLINK libspdk_thread.so 00:01:55.423 CC lib/init/json_config.o 00:01:55.423 CC lib/virtio/virtio.o 00:01:55.423 CC lib/accel/accel.o 00:01:55.423 CC lib/blob/blobstore.o 00:01:55.423 CC lib/init/subsystem.o 00:01:55.423 CC lib/accel/accel_rpc.o 00:01:55.423 CC lib/blob/request.o 00:01:55.423 CC lib/virtio/virtio_vhost_user.o 00:01:55.423 CC lib/init/subsystem_rpc.o 00:01:55.423 CC lib/accel/accel_sw.o 00:01:55.423 CC lib/blob/zeroes.o 00:01:55.423 CC lib/init/rpc.o 00:01:55.423 CC lib/virtio/virtio_vfio_user.o 00:01:55.423 CC lib/blob/blob_bs_dev.o 00:01:55.423 CC lib/virtio/virtio_pci.o 00:01:55.680 LIB libspdk_init.a 00:01:55.680 SO libspdk_init.so.5.0 00:01:55.680 LIB libspdk_virtio.a 00:01:55.937 SYMLINK libspdk_init.so 00:01:55.937 SO libspdk_virtio.so.7.0 00:01:55.937 SYMLINK libspdk_virtio.so 00:01:55.937 CC lib/event/app.o 00:01:55.937 CC lib/event/reactor.o 00:01:55.937 CC lib/event/log_rpc.o 00:01:55.937 CC lib/event/app_rpc.o 00:01:55.937 CC lib/event/scheduler_static.o 00:01:56.503 LIB libspdk_event.a 00:01:56.503 SO libspdk_event.so.13.0 00:01:56.503 SYMLINK libspdk_event.so 00:01:56.503 LIB libspdk_accel.a 00:01:56.503 SO libspdk_accel.so.15.0 00:01:56.503 LIB libspdk_nvme.a 00:01:56.503 SYMLINK libspdk_accel.so 00:01:56.761 SO libspdk_nvme.so.13.0 00:01:56.761 CC lib/bdev/bdev.o 00:01:56.761 CC lib/bdev/bdev_rpc.o 00:01:56.761 CC lib/bdev/bdev_zone.o 00:01:56.761 CC lib/bdev/part.o 00:01:56.761 CC lib/bdev/scsi_nvme.o 00:01:57.019 SYMLINK libspdk_nvme.so 00:01:58.394 LIB libspdk_blob.a 00:01:58.394 SO libspdk_blob.so.11.0 00:01:58.652 SYMLINK libspdk_blob.so 00:01:58.652 CC lib/blobfs/blobfs.o 00:01:58.652 CC lib/lvol/lvol.o 00:01:58.652 CC lib/blobfs/tree.o 00:01:59.592 LIB libspdk_bdev.a 00:01:59.592 SO libspdk_bdev.so.15.0 00:01:59.592 SYMLINK libspdk_bdev.so 00:01:59.592 LIB libspdk_blobfs.a 00:01:59.592 SO libspdk_blobfs.so.10.0 00:01:59.592 SYMLINK libspdk_blobfs.so 00:01:59.592 LIB libspdk_lvol.a 00:01:59.592 SO libspdk_lvol.so.10.0 00:01:59.592 CC lib/scsi/dev.o 00:01:59.592 CC lib/nvmf/ctrlr.o 00:01:59.592 CC lib/ublk/ublk.o 00:01:59.592 CC lib/nbd/nbd.o 00:01:59.592 CC lib/scsi/lun.o 00:01:59.592 CC lib/nvmf/ctrlr_discovery.o 00:01:59.592 CC lib/ublk/ublk_rpc.o 00:01:59.592 CC lib/nbd/nbd_rpc.o 00:01:59.592 CC lib/scsi/port.o 00:01:59.592 CC lib/nvmf/ctrlr_bdev.o 00:01:59.592 CC lib/ftl/ftl_core.o 00:01:59.592 CC lib/scsi/scsi.o 00:01:59.592 CC lib/ftl/ftl_init.o 00:01:59.592 CC lib/scsi/scsi_bdev.o 00:01:59.592 CC lib/nvmf/subsystem.o 00:01:59.592 CC lib/ftl/ftl_layout.o 00:01:59.592 CC lib/scsi/scsi_pr.o 00:01:59.592 CC lib/ftl/ftl_debug.o 00:01:59.592 CC lib/nvmf/nvmf.o 00:01:59.592 CC lib/scsi/scsi_rpc.o 00:01:59.592 CC lib/nvmf/nvmf_rpc.o 00:01:59.592 CC lib/scsi/task.o 00:01:59.592 CC 
lib/ftl/ftl_io.o 00:01:59.592 CC lib/ftl/ftl_sb.o 00:01:59.592 CC lib/nvmf/transport.o 00:01:59.592 CC lib/nvmf/tcp.o 00:01:59.592 CC lib/ftl/ftl_l2p.o 00:01:59.592 CC lib/ftl/ftl_l2p_flat.o 00:01:59.592 CC lib/nvmf/stubs.o 00:01:59.592 CC lib/nvmf/mdns_server.o 00:01:59.592 CC lib/ftl/ftl_nv_cache.o 00:01:59.592 CC lib/nvmf/auth.o 00:01:59.592 CC lib/nvmf/rdma.o 00:01:59.592 CC lib/ftl/ftl_band.o 00:01:59.592 CC lib/ftl/ftl_band_ops.o 00:01:59.592 CC lib/ftl/ftl_writer.o 00:01:59.592 CC lib/ftl/ftl_rq.o 00:01:59.592 CC lib/ftl/ftl_reloc.o 00:01:59.592 CC lib/ftl/ftl_l2p_cache.o 00:01:59.592 CC lib/ftl/ftl_p2l.o 00:01:59.592 CC lib/ftl/mngt/ftl_mngt.o 00:01:59.592 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:01:59.592 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:01:59.592 CC lib/ftl/mngt/ftl_mngt_startup.o 00:01:59.592 CC lib/ftl/mngt/ftl_mngt_md.o 00:01:59.592 CC lib/ftl/mngt/ftl_mngt_misc.o 00:01:59.592 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:01:59.853 SYMLINK libspdk_lvol.so 00:01:59.853 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:00.113 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:00.113 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:00.113 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:00.113 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:00.113 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:00.113 CC lib/ftl/utils/ftl_conf.o 00:02:00.113 CC lib/ftl/utils/ftl_md.o 00:02:00.113 CC lib/ftl/utils/ftl_mempool.o 00:02:00.113 CC lib/ftl/utils/ftl_bitmap.o 00:02:00.113 CC lib/ftl/utils/ftl_property.o 00:02:00.113 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:00.113 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:00.113 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:00.113 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:00.114 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:00.114 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:00.114 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:00.114 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:00.114 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:00.114 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:00.374 CC lib/ftl/base/ftl_base_dev.o 00:02:00.374 CC lib/ftl/base/ftl_base_bdev.o 00:02:00.374 CC lib/ftl/ftl_trace.o 00:02:00.374 LIB libspdk_nbd.a 00:02:00.374 SO libspdk_nbd.so.7.0 00:02:00.632 LIB libspdk_scsi.a 00:02:00.632 SYMLINK libspdk_nbd.so 00:02:00.632 SO libspdk_scsi.so.9.0 00:02:00.632 SYMLINK libspdk_scsi.so 00:02:00.632 LIB libspdk_ublk.a 00:02:00.632 SO libspdk_ublk.so.3.0 00:02:00.890 SYMLINK libspdk_ublk.so 00:02:00.890 CC lib/vhost/vhost.o 00:02:00.890 CC lib/iscsi/conn.o 00:02:00.890 CC lib/vhost/vhost_rpc.o 00:02:00.890 CC lib/iscsi/init_grp.o 00:02:00.890 CC lib/vhost/vhost_scsi.o 00:02:00.890 CC lib/vhost/vhost_blk.o 00:02:00.890 CC lib/iscsi/iscsi.o 00:02:00.890 CC lib/iscsi/md5.o 00:02:00.890 CC lib/iscsi/param.o 00:02:00.890 CC lib/vhost/rte_vhost_user.o 00:02:00.890 CC lib/iscsi/portal_grp.o 00:02:00.890 CC lib/iscsi/tgt_node.o 00:02:00.890 CC lib/iscsi/iscsi_subsystem.o 00:02:00.890 CC lib/iscsi/iscsi_rpc.o 00:02:00.890 CC lib/iscsi/task.o 00:02:01.148 LIB libspdk_ftl.a 00:02:01.148 SO libspdk_ftl.so.9.0 00:02:01.713 SYMLINK libspdk_ftl.so 00:02:02.004 LIB libspdk_vhost.a 00:02:02.004 SO libspdk_vhost.so.8.0 00:02:02.287 LIB libspdk_nvmf.a 00:02:02.287 SYMLINK libspdk_vhost.so 00:02:02.287 SO libspdk_nvmf.so.18.0 00:02:02.287 LIB libspdk_iscsi.a 00:02:02.287 SO libspdk_iscsi.so.8.0 00:02:02.546 SYMLINK libspdk_nvmf.so 00:02:02.546 SYMLINK libspdk_iscsi.so 00:02:02.804 CC module/env_dpdk/env_dpdk_rpc.o 00:02:02.804 CC module/sock/posix/posix.o 00:02:02.804 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:02.804 CC 
module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:02.804 CC module/accel/iaa/accel_iaa.o 00:02:02.804 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:02.804 CC module/accel/dsa/accel_dsa.o 00:02:02.804 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:02.804 CC module/keyring/file/keyring.o 00:02:02.804 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:02.804 CC module/accel/iaa/accel_iaa_rpc.o 00:02:02.804 CC module/accel/ioat/accel_ioat.o 00:02:02.804 CC module/accel/ioat/accel_ioat_rpc.o 00:02:02.804 CC module/keyring/file/keyring_rpc.o 00:02:02.804 CC module/accel/dsa/accel_dsa_rpc.o 00:02:02.804 CC module/blob/bdev/blob_bdev.o 00:02:02.804 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:02.804 CC module/accel/error/accel_error.o 00:02:02.804 CC module/accel/error/accel_error_rpc.o 00:02:02.804 CC module/scheduler/gscheduler/gscheduler.o 00:02:02.804 LIB libspdk_env_dpdk_rpc.a 00:02:02.804 SO libspdk_env_dpdk_rpc.so.6.0 00:02:03.062 SYMLINK libspdk_env_dpdk_rpc.so 00:02:03.062 LIB libspdk_keyring_file.a 00:02:03.062 LIB libspdk_scheduler_gscheduler.a 00:02:03.062 LIB libspdk_scheduler_dpdk_governor.a 00:02:03.062 SO libspdk_scheduler_gscheduler.so.4.0 00:02:03.062 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:03.062 SO libspdk_keyring_file.so.1.0 00:02:03.062 LIB libspdk_accel_error.a 00:02:03.062 LIB libspdk_scheduler_dynamic.a 00:02:03.062 LIB libspdk_accel_ioat.a 00:02:03.062 LIB libspdk_accel_iaa.a 00:02:03.062 SO libspdk_accel_error.so.2.0 00:02:03.062 SO libspdk_scheduler_dynamic.so.4.0 00:02:03.062 SO libspdk_accel_ioat.so.6.0 00:02:03.062 SYMLINK libspdk_scheduler_gscheduler.so 00:02:03.062 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:03.062 SYMLINK libspdk_keyring_file.so 00:02:03.062 SO libspdk_accel_iaa.so.3.0 00:02:03.062 LIB libspdk_accel_dsa.a 00:02:03.062 SYMLINK libspdk_scheduler_dynamic.so 00:02:03.062 SO libspdk_accel_dsa.so.5.0 00:02:03.062 SYMLINK libspdk_accel_error.so 00:02:03.062 SYMLINK libspdk_accel_ioat.so 00:02:03.062 LIB libspdk_blob_bdev.a 00:02:03.062 SYMLINK libspdk_accel_iaa.so 00:02:03.062 SO libspdk_blob_bdev.so.11.0 00:02:03.062 SYMLINK libspdk_accel_dsa.so 00:02:03.321 SYMLINK libspdk_blob_bdev.so 00:02:03.321 CC module/bdev/lvol/vbdev_lvol.o 00:02:03.321 CC module/bdev/split/vbdev_split.o 00:02:03.321 CC module/bdev/null/bdev_null.o 00:02:03.321 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:03.321 CC module/bdev/malloc/bdev_malloc.o 00:02:03.321 CC module/bdev/split/vbdev_split_rpc.o 00:02:03.321 CC module/bdev/null/bdev_null_rpc.o 00:02:03.321 CC module/blobfs/bdev/blobfs_bdev.o 00:02:03.321 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:03.321 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:03.321 CC module/bdev/delay/vbdev_delay.o 00:02:03.321 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:03.321 CC module/bdev/gpt/gpt.o 00:02:03.586 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:03.586 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:03.586 CC module/bdev/gpt/vbdev_gpt.o 00:02:03.586 CC module/bdev/raid/bdev_raid.o 00:02:03.586 CC module/bdev/crypto/vbdev_crypto.o 00:02:03.586 CC module/bdev/ftl/bdev_ftl.o 00:02:03.586 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:03.586 CC module/bdev/raid/bdev_raid_rpc.o 00:02:03.586 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:03.586 CC module/bdev/raid/bdev_raid_sb.o 00:02:03.586 CC module/bdev/raid/raid0.o 00:02:03.586 CC module/bdev/compress/vbdev_compress.o 00:02:03.586 CC module/bdev/nvme/bdev_nvme.o 00:02:03.586 CC module/bdev/raid/raid1.o 
00:02:03.586 CC module/bdev/iscsi/bdev_iscsi.o 00:02:03.586 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:03.586 CC module/bdev/raid/concat.o 00:02:03.586 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:03.586 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:03.586 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:03.586 CC module/bdev/nvme/nvme_rpc.o 00:02:03.586 CC module/bdev/aio/bdev_aio.o 00:02:03.586 CC module/bdev/passthru/vbdev_passthru.o 00:02:03.586 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:03.586 CC module/bdev/aio/bdev_aio_rpc.o 00:02:03.586 CC module/bdev/nvme/bdev_mdns_client.o 00:02:03.586 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:03.586 CC module/bdev/error/vbdev_error.o 00:02:03.586 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:03.586 CC module/bdev/nvme/vbdev_opal.o 00:02:03.586 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:03.586 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:03.844 LIB libspdk_sock_posix.a 00:02:03.844 SO libspdk_sock_posix.so.6.0 00:02:03.844 CC module/bdev/error/vbdev_error_rpc.o 00:02:03.844 LIB libspdk_blobfs_bdev.a 00:02:03.844 SO libspdk_blobfs_bdev.so.6.0 00:02:03.844 SYMLINK libspdk_sock_posix.so 00:02:03.844 LIB libspdk_bdev_split.a 00:02:03.844 LIB libspdk_bdev_gpt.a 00:02:03.844 SO libspdk_bdev_split.so.6.0 00:02:03.844 SYMLINK libspdk_blobfs_bdev.so 00:02:03.844 SO libspdk_bdev_gpt.so.6.0 00:02:03.844 LIB libspdk_bdev_null.a 00:02:03.844 LIB libspdk_bdev_passthru.a 00:02:04.102 SO libspdk_bdev_null.so.6.0 00:02:04.102 SYMLINK libspdk_bdev_split.so 00:02:04.102 SO libspdk_bdev_passthru.so.6.0 00:02:04.102 LIB libspdk_bdev_ftl.a 00:02:04.102 SYMLINK libspdk_bdev_gpt.so 00:02:04.102 LIB libspdk_bdev_crypto.a 00:02:04.102 SO libspdk_bdev_ftl.so.6.0 00:02:04.102 SO libspdk_bdev_crypto.so.6.0 00:02:04.102 SYMLINK libspdk_bdev_null.so 00:02:04.102 SYMLINK libspdk_bdev_passthru.so 00:02:04.102 LIB libspdk_bdev_aio.a 00:02:04.102 LIB libspdk_bdev_zone_block.a 00:02:04.102 LIB libspdk_bdev_compress.a 00:02:04.102 LIB libspdk_bdev_error.a 00:02:04.102 SYMLINK libspdk_bdev_ftl.so 00:02:04.102 SO libspdk_bdev_aio.so.6.0 00:02:04.102 LIB libspdk_bdev_iscsi.a 00:02:04.102 SYMLINK libspdk_bdev_crypto.so 00:02:04.102 SO libspdk_bdev_zone_block.so.6.0 00:02:04.102 LIB libspdk_bdev_malloc.a 00:02:04.102 SO libspdk_bdev_compress.so.6.0 00:02:04.102 SO libspdk_bdev_error.so.6.0 00:02:04.102 LIB libspdk_bdev_delay.a 00:02:04.102 SO libspdk_bdev_iscsi.so.6.0 00:02:04.102 SO libspdk_bdev_malloc.so.6.0 00:02:04.102 SO libspdk_bdev_delay.so.6.0 00:02:04.102 SYMLINK libspdk_bdev_aio.so 00:02:04.102 SYMLINK libspdk_bdev_zone_block.so 00:02:04.102 SYMLINK libspdk_bdev_compress.so 00:02:04.102 SYMLINK libspdk_bdev_error.so 00:02:04.102 SYMLINK libspdk_bdev_iscsi.so 00:02:04.102 LIB libspdk_bdev_lvol.a 00:02:04.102 SYMLINK libspdk_bdev_malloc.so 00:02:04.102 SYMLINK libspdk_bdev_delay.so 00:02:04.102 SO libspdk_bdev_lvol.so.6.0 00:02:04.360 LIB libspdk_bdev_virtio.a 00:02:04.360 SO libspdk_bdev_virtio.so.6.0 00:02:04.360 SYMLINK libspdk_bdev_lvol.so 00:02:04.360 SYMLINK libspdk_bdev_virtio.so 00:02:04.618 LIB libspdk_bdev_raid.a 00:02:04.618 SO libspdk_bdev_raid.so.6.0 00:02:04.876 SYMLINK libspdk_bdev_raid.so 00:02:04.876 LIB libspdk_accel_dpdk_compressdev.a 00:02:04.876 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:04.876 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:05.442 LIB libspdk_accel_dpdk_cryptodev.a 00:02:05.700 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:05.700 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:05.957 LIB libspdk_bdev_nvme.a 00:02:05.958 SO 
libspdk_bdev_nvme.so.7.0 00:02:05.958 SYMLINK libspdk_bdev_nvme.so 00:02:06.215 CC module/event/subsystems/vmd/vmd.o 00:02:06.215 CC module/event/subsystems/iobuf/iobuf.o 00:02:06.215 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:06.215 CC module/event/subsystems/sock/sock.o 00:02:06.215 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:06.215 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:06.215 CC module/event/subsystems/keyring/keyring.o 00:02:06.215 CC module/event/subsystems/scheduler/scheduler.o 00:02:06.473 LIB libspdk_event_sock.a 00:02:06.473 LIB libspdk_event_keyring.a 00:02:06.473 LIB libspdk_event_vhost_blk.a 00:02:06.473 LIB libspdk_event_scheduler.a 00:02:06.473 LIB libspdk_event_vmd.a 00:02:06.473 SO libspdk_event_sock.so.5.0 00:02:06.473 SO libspdk_event_keyring.so.1.0 00:02:06.473 LIB libspdk_event_iobuf.a 00:02:06.473 SO libspdk_event_scheduler.so.4.0 00:02:06.473 SO libspdk_event_vhost_blk.so.3.0 00:02:06.473 SO libspdk_event_vmd.so.6.0 00:02:06.473 SO libspdk_event_iobuf.so.3.0 00:02:06.473 SYMLINK libspdk_event_sock.so 00:02:06.473 SYMLINK libspdk_event_keyring.so 00:02:06.473 SYMLINK libspdk_event_vhost_blk.so 00:02:06.473 SYMLINK libspdk_event_scheduler.so 00:02:06.473 SYMLINK libspdk_event_vmd.so 00:02:06.473 SYMLINK libspdk_event_iobuf.so 00:02:06.731 CC module/event/subsystems/accel/accel.o 00:02:06.989 LIB libspdk_event_accel.a 00:02:06.989 SO libspdk_event_accel.so.6.0 00:02:06.989 SYMLINK libspdk_event_accel.so 00:02:07.247 CC module/event/subsystems/bdev/bdev.o 00:02:07.247 LIB libspdk_event_bdev.a 00:02:07.247 SO libspdk_event_bdev.so.6.0 00:02:07.505 SYMLINK libspdk_event_bdev.so 00:02:07.505 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:07.505 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:07.505 CC module/event/subsystems/scsi/scsi.o 00:02:07.505 CC module/event/subsystems/nbd/nbd.o 00:02:07.505 CC module/event/subsystems/ublk/ublk.o 00:02:07.762 LIB libspdk_event_nbd.a 00:02:07.763 LIB libspdk_event_ublk.a 00:02:07.763 LIB libspdk_event_scsi.a 00:02:07.763 SO libspdk_event_nbd.so.6.0 00:02:07.763 SO libspdk_event_ublk.so.3.0 00:02:07.763 SO libspdk_event_scsi.so.6.0 00:02:07.763 SYMLINK libspdk_event_ublk.so 00:02:07.763 SYMLINK libspdk_event_nbd.so 00:02:07.763 LIB libspdk_event_nvmf.a 00:02:07.763 SYMLINK libspdk_event_scsi.so 00:02:07.763 SO libspdk_event_nvmf.so.6.0 00:02:07.763 SYMLINK libspdk_event_nvmf.so 00:02:08.020 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:08.020 CC module/event/subsystems/iscsi/iscsi.o 00:02:08.020 LIB libspdk_event_vhost_scsi.a 00:02:08.020 LIB libspdk_event_iscsi.a 00:02:08.020 SO libspdk_event_vhost_scsi.so.3.0 00:02:08.020 SO libspdk_event_iscsi.so.6.0 00:02:08.278 SYMLINK libspdk_event_vhost_scsi.so 00:02:08.278 SYMLINK libspdk_event_iscsi.so 00:02:08.278 SO libspdk.so.6.0 00:02:08.278 SYMLINK libspdk.so 00:02:08.549 CXX app/trace/trace.o 00:02:08.549 CC app/spdk_nvme_perf/perf.o 00:02:08.549 CC app/spdk_top/spdk_top.o 00:02:08.549 CC test/rpc_client/rpc_client_test.o 00:02:08.549 CC app/spdk_nvme_identify/identify.o 00:02:08.549 TEST_HEADER include/spdk/accel.h 00:02:08.549 CC app/spdk_lspci/spdk_lspci.o 00:02:08.549 CC app/spdk_nvme_discover/discovery_aer.o 00:02:08.549 TEST_HEADER include/spdk/accel_module.h 00:02:08.549 CC app/trace_record/trace_record.o 00:02:08.549 TEST_HEADER include/spdk/assert.h 00:02:08.549 TEST_HEADER include/spdk/barrier.h 00:02:08.549 TEST_HEADER include/spdk/base64.h 00:02:08.549 TEST_HEADER include/spdk/bdev.h 00:02:08.549 TEST_HEADER include/spdk/bdev_module.h 
00:02:08.549 TEST_HEADER include/spdk/bdev_zone.h 00:02:08.549 TEST_HEADER include/spdk/bit_array.h 00:02:08.549 TEST_HEADER include/spdk/bit_pool.h 00:02:08.549 TEST_HEADER include/spdk/blob_bdev.h 00:02:08.549 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:08.549 TEST_HEADER include/spdk/blobfs.h 00:02:08.549 TEST_HEADER include/spdk/blob.h 00:02:08.549 TEST_HEADER include/spdk/conf.h 00:02:08.549 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:08.549 TEST_HEADER include/spdk/config.h 00:02:08.549 TEST_HEADER include/spdk/cpuset.h 00:02:08.549 CC app/spdk_dd/spdk_dd.o 00:02:08.549 TEST_HEADER include/spdk/crc16.h 00:02:08.549 TEST_HEADER include/spdk/crc32.h 00:02:08.549 TEST_HEADER include/spdk/crc64.h 00:02:08.549 TEST_HEADER include/spdk/dif.h 00:02:08.549 TEST_HEADER include/spdk/dma.h 00:02:08.549 TEST_HEADER include/spdk/endian.h 00:02:08.549 TEST_HEADER include/spdk/env_dpdk.h 00:02:08.549 TEST_HEADER include/spdk/env.h 00:02:08.549 TEST_HEADER include/spdk/event.h 00:02:08.549 TEST_HEADER include/spdk/fd_group.h 00:02:08.549 CC app/iscsi_tgt/iscsi_tgt.o 00:02:08.549 CC app/nvmf_tgt/nvmf_main.o 00:02:08.549 TEST_HEADER include/spdk/fd.h 00:02:08.549 TEST_HEADER include/spdk/file.h 00:02:08.549 TEST_HEADER include/spdk/ftl.h 00:02:08.549 TEST_HEADER include/spdk/gpt_spec.h 00:02:08.549 CC app/vhost/vhost.o 00:02:08.549 TEST_HEADER include/spdk/hexlify.h 00:02:08.549 TEST_HEADER include/spdk/histogram_data.h 00:02:08.549 TEST_HEADER include/spdk/idxd.h 00:02:08.549 TEST_HEADER include/spdk/idxd_spec.h 00:02:08.549 CC test/app/histogram_perf/histogram_perf.o 00:02:08.549 CC examples/sock/hello_world/hello_sock.o 00:02:08.549 TEST_HEADER include/spdk/init.h 00:02:08.549 TEST_HEADER include/spdk/ioat.h 00:02:08.549 CC examples/ioat/perf/perf.o 00:02:08.549 TEST_HEADER include/spdk/ioat_spec.h 00:02:08.549 CC app/spdk_tgt/spdk_tgt.o 00:02:08.549 CC examples/ioat/verify/verify.o 00:02:08.549 TEST_HEADER include/spdk/iscsi_spec.h 00:02:08.549 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:08.549 CC test/app/stub/stub.o 00:02:08.549 TEST_HEADER include/spdk/json.h 00:02:08.549 CC test/nvme/aer/aer.o 00:02:08.549 CC examples/nvme/hello_world/hello_world.o 00:02:08.549 CC examples/util/zipf/zipf.o 00:02:08.549 TEST_HEADER include/spdk/jsonrpc.h 00:02:08.549 CC test/env/vtophys/vtophys.o 00:02:08.549 TEST_HEADER include/spdk/keyring.h 00:02:08.549 CC test/nvme/reset/reset.o 00:02:08.549 CC test/app/jsoncat/jsoncat.o 00:02:08.549 CC examples/accel/perf/accel_perf.o 00:02:08.549 TEST_HEADER include/spdk/keyring_module.h 00:02:08.549 CC app/fio/nvme/fio_plugin.o 00:02:08.810 CC examples/vmd/lsvmd/lsvmd.o 00:02:08.810 TEST_HEADER include/spdk/likely.h 00:02:08.810 CC examples/vmd/led/led.o 00:02:08.810 CC examples/idxd/perf/perf.o 00:02:08.810 TEST_HEADER include/spdk/log.h 00:02:08.810 TEST_HEADER include/spdk/lvol.h 00:02:08.810 CC test/event/event_perf/event_perf.o 00:02:08.810 CC test/thread/poller_perf/poller_perf.o 00:02:08.810 TEST_HEADER include/spdk/memory.h 00:02:08.810 TEST_HEADER include/spdk/mmio.h 00:02:08.810 TEST_HEADER include/spdk/nbd.h 00:02:08.810 TEST_HEADER include/spdk/notify.h 00:02:08.810 TEST_HEADER include/spdk/nvme.h 00:02:08.810 TEST_HEADER include/spdk/nvme_intel.h 00:02:08.810 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:08.810 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:08.810 TEST_HEADER include/spdk/nvme_spec.h 00:02:08.810 TEST_HEADER include/spdk/nvme_zns.h 00:02:08.810 CC test/dma/test_dma/test_dma.o 00:02:08.810 TEST_HEADER 
include/spdk/nvmf_cmd.h 00:02:08.810 CC test/accel/dif/dif.o 00:02:08.810 CC examples/blob/hello_world/hello_blob.o 00:02:08.810 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:08.810 TEST_HEADER include/spdk/nvmf.h 00:02:08.810 CC test/app/bdev_svc/bdev_svc.o 00:02:08.810 CC examples/bdev/bdevperf/bdevperf.o 00:02:08.810 TEST_HEADER include/spdk/nvmf_spec.h 00:02:08.810 CC examples/blob/cli/blobcli.o 00:02:08.810 TEST_HEADER include/spdk/nvmf_transport.h 00:02:08.810 CC examples/thread/thread/thread_ex.o 00:02:08.810 TEST_HEADER include/spdk/opal.h 00:02:08.810 CC examples/bdev/hello_world/hello_bdev.o 00:02:08.810 CC test/blobfs/mkfs/mkfs.o 00:02:08.810 CC examples/nvmf/nvmf/nvmf.o 00:02:08.810 TEST_HEADER include/spdk/opal_spec.h 00:02:08.810 TEST_HEADER include/spdk/pci_ids.h 00:02:08.810 TEST_HEADER include/spdk/pipe.h 00:02:08.810 CC test/bdev/bdevio/bdevio.o 00:02:08.810 TEST_HEADER include/spdk/queue.h 00:02:08.810 TEST_HEADER include/spdk/reduce.h 00:02:08.810 TEST_HEADER include/spdk/rpc.h 00:02:08.810 TEST_HEADER include/spdk/scheduler.h 00:02:08.810 TEST_HEADER include/spdk/scsi.h 00:02:08.810 TEST_HEADER include/spdk/scsi_spec.h 00:02:08.810 TEST_HEADER include/spdk/sock.h 00:02:08.810 TEST_HEADER include/spdk/stdinc.h 00:02:08.810 TEST_HEADER include/spdk/string.h 00:02:08.810 TEST_HEADER include/spdk/thread.h 00:02:08.810 TEST_HEADER include/spdk/trace.h 00:02:08.810 LINK spdk_lspci 00:02:08.810 TEST_HEADER include/spdk/trace_parser.h 00:02:08.810 TEST_HEADER include/spdk/tree.h 00:02:08.810 TEST_HEADER include/spdk/ublk.h 00:02:08.810 CC test/env/mem_callbacks/mem_callbacks.o 00:02:08.810 TEST_HEADER include/spdk/util.h 00:02:08.810 TEST_HEADER include/spdk/uuid.h 00:02:08.810 TEST_HEADER include/spdk/version.h 00:02:08.810 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:08.810 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:08.810 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:08.810 TEST_HEADER include/spdk/vhost.h 00:02:08.810 TEST_HEADER include/spdk/vmd.h 00:02:08.810 TEST_HEADER include/spdk/xor.h 00:02:08.810 CC test/lvol/esnap/esnap.o 00:02:08.810 TEST_HEADER include/spdk/zipf.h 00:02:08.810 CXX test/cpp_headers/accel.o 00:02:08.810 LINK rpc_client_test 00:02:08.810 LINK spdk_nvme_discover 00:02:09.073 LINK interrupt_tgt 00:02:09.074 LINK jsoncat 00:02:09.074 LINK lsvmd 00:02:09.074 LINK histogram_perf 00:02:09.074 LINK vtophys 00:02:09.074 LINK event_perf 00:02:09.074 LINK zipf 00:02:09.074 LINK env_dpdk_post_init 00:02:09.074 LINK poller_perf 00:02:09.074 LINK led 00:02:09.074 LINK nvmf_tgt 00:02:09.074 LINK vhost 00:02:09.074 LINK spdk_trace_record 00:02:09.074 LINK iscsi_tgt 00:02:09.074 LINK stub 00:02:09.074 LINK ioat_perf 00:02:09.074 LINK spdk_tgt 00:02:09.074 LINK verify 00:02:09.074 LINK hello_world 00:02:09.074 LINK bdev_svc 00:02:09.074 LINK hello_sock 00:02:09.074 LINK mkfs 00:02:09.074 LINK aer 00:02:09.341 LINK hello_blob 00:02:09.341 LINK reset 00:02:09.341 CXX test/cpp_headers/accel_module.o 00:02:09.341 LINK hello_bdev 00:02:09.341 LINK thread 00:02:09.341 LINK spdk_dd 00:02:09.341 LINK idxd_perf 00:02:09.341 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:09.341 CXX test/cpp_headers/assert.o 00:02:09.341 LINK nvmf 00:02:09.341 LINK spdk_trace 00:02:09.341 CC examples/nvme/reconnect/reconnect.o 00:02:09.341 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:09.341 LINK test_dma 00:02:09.605 CC examples/nvme/arbitration/arbitration.o 00:02:09.605 LINK dif 00:02:09.605 CC test/event/reactor_perf/reactor_perf.o 00:02:09.605 CC test/event/reactor/reactor.o 
00:02:09.605 CXX test/cpp_headers/barrier.o 00:02:09.605 CC test/nvme/sgl/sgl.o 00:02:09.605 LINK accel_perf 00:02:09.605 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:09.605 CC examples/nvme/hotplug/hotplug.o 00:02:09.605 LINK bdevio 00:02:09.605 CC test/nvme/e2edp/nvme_dp.o 00:02:09.605 CC test/event/app_repeat/app_repeat.o 00:02:09.605 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:09.605 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:09.605 CC app/fio/bdev/fio_plugin.o 00:02:09.605 CC examples/nvme/abort/abort.o 00:02:09.605 CC test/nvme/overhead/overhead.o 00:02:09.605 CXX test/cpp_headers/base64.o 00:02:09.605 CC test/nvme/err_injection/err_injection.o 00:02:09.605 CXX test/cpp_headers/bdev.o 00:02:09.605 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:09.605 CC test/env/memory/memory_ut.o 00:02:09.605 LINK nvme_fuzz 00:02:09.605 CC test/nvme/startup/startup.o 00:02:09.605 LINK blobcli 00:02:09.605 CXX test/cpp_headers/bdev_module.o 00:02:09.605 CC test/event/scheduler/scheduler.o 00:02:09.605 CXX test/cpp_headers/bdev_zone.o 00:02:09.605 CXX test/cpp_headers/bit_array.o 00:02:09.605 CC test/nvme/reserve/reserve.o 00:02:09.870 CC test/nvme/simple_copy/simple_copy.o 00:02:09.870 CC test/env/pci/pci_ut.o 00:02:09.870 LINK spdk_nvme 00:02:09.870 CXX test/cpp_headers/bit_pool.o 00:02:09.870 LINK reactor_perf 00:02:09.870 LINK reactor 00:02:09.870 CC test/nvme/boot_partition/boot_partition.o 00:02:09.870 CC test/nvme/connect_stress/connect_stress.o 00:02:09.870 CC test/nvme/compliance/nvme_compliance.o 00:02:09.870 LINK app_repeat 00:02:09.870 CXX test/cpp_headers/blob_bdev.o 00:02:09.870 CXX test/cpp_headers/blobfs_bdev.o 00:02:09.870 LINK cmb_copy 00:02:09.870 CXX test/cpp_headers/blobfs.o 00:02:09.870 CXX test/cpp_headers/blob.o 00:02:09.870 CXX test/cpp_headers/conf.o 00:02:09.870 CC test/nvme/fused_ordering/fused_ordering.o 00:02:09.870 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:09.870 LINK pmr_persistence 00:02:09.870 LINK err_injection 00:02:10.130 LINK hotplug 00:02:10.130 CC test/nvme/fdp/fdp.o 00:02:10.130 LINK mem_callbacks 00:02:10.130 CXX test/cpp_headers/config.o 00:02:10.130 LINK spdk_nvme_perf 00:02:10.130 CXX test/cpp_headers/cpuset.o 00:02:10.130 LINK sgl 00:02:10.130 LINK startup 00:02:10.130 CXX test/cpp_headers/crc16.o 00:02:10.130 CC test/nvme/cuse/cuse.o 00:02:10.130 LINK arbitration 00:02:10.130 LINK nvme_dp 00:02:10.130 CXX test/cpp_headers/crc32.o 00:02:10.130 LINK reconnect 00:02:10.130 CXX test/cpp_headers/crc64.o 00:02:10.130 LINK overhead 00:02:10.130 CXX test/cpp_headers/dif.o 00:02:10.130 CXX test/cpp_headers/dma.o 00:02:10.130 CXX test/cpp_headers/endian.o 00:02:10.130 CXX test/cpp_headers/env_dpdk.o 00:02:10.130 LINK spdk_nvme_identify 00:02:10.130 LINK scheduler 00:02:10.130 LINK reserve 00:02:10.130 CXX test/cpp_headers/env.o 00:02:10.130 LINK boot_partition 00:02:10.130 LINK spdk_top 00:02:10.130 LINK simple_copy 00:02:10.130 LINK connect_stress 00:02:10.130 CXX test/cpp_headers/event.o 00:02:10.130 LINK bdevperf 00:02:10.389 LINK abort 00:02:10.389 CXX test/cpp_headers/fd_group.o 00:02:10.389 CXX test/cpp_headers/fd.o 00:02:10.389 CXX test/cpp_headers/file.o 00:02:10.389 CXX test/cpp_headers/ftl.o 00:02:10.389 CXX test/cpp_headers/gpt_spec.o 00:02:10.389 CXX test/cpp_headers/hexlify.o 00:02:10.389 CXX test/cpp_headers/histogram_data.o 00:02:10.389 CXX test/cpp_headers/idxd.o 00:02:10.389 CXX test/cpp_headers/idxd_spec.o 00:02:10.389 LINK doorbell_aers 00:02:10.389 CXX test/cpp_headers/init.o 00:02:10.389 LINK vhost_fuzz 00:02:10.389 
CXX test/cpp_headers/ioat.o 00:02:10.389 LINK nvme_manage 00:02:10.389 LINK fused_ordering 00:02:10.389 CXX test/cpp_headers/ioat_spec.o 00:02:10.389 CXX test/cpp_headers/iscsi_spec.o 00:02:10.389 CXX test/cpp_headers/json.o 00:02:10.389 CXX test/cpp_headers/jsonrpc.o 00:02:10.389 CXX test/cpp_headers/keyring.o 00:02:10.389 CXX test/cpp_headers/keyring_module.o 00:02:10.389 CXX test/cpp_headers/likely.o 00:02:10.389 CXX test/cpp_headers/log.o 00:02:10.389 CXX test/cpp_headers/lvol.o 00:02:10.389 CXX test/cpp_headers/memory.o 00:02:10.389 CXX test/cpp_headers/mmio.o 00:02:10.389 LINK pci_ut 00:02:10.389 CXX test/cpp_headers/nbd.o 00:02:10.389 CXX test/cpp_headers/notify.o 00:02:10.389 LINK spdk_bdev 00:02:10.389 LINK nvme_compliance 00:02:10.653 CXX test/cpp_headers/nvme.o 00:02:10.653 CXX test/cpp_headers/nvme_intel.o 00:02:10.653 CXX test/cpp_headers/nvme_ocssd.o 00:02:10.653 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:10.653 CXX test/cpp_headers/nvme_spec.o 00:02:10.653 CXX test/cpp_headers/nvme_zns.o 00:02:10.653 CXX test/cpp_headers/nvmf_cmd.o 00:02:10.653 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:10.653 CXX test/cpp_headers/nvmf.o 00:02:10.653 CXX test/cpp_headers/nvmf_spec.o 00:02:10.653 CXX test/cpp_headers/nvmf_transport.o 00:02:10.653 CXX test/cpp_headers/opal.o 00:02:10.653 CXX test/cpp_headers/opal_spec.o 00:02:10.653 CXX test/cpp_headers/pci_ids.o 00:02:10.653 CXX test/cpp_headers/pipe.o 00:02:10.653 CXX test/cpp_headers/queue.o 00:02:10.653 CXX test/cpp_headers/reduce.o 00:02:10.653 LINK fdp 00:02:10.653 CXX test/cpp_headers/rpc.o 00:02:10.653 CXX test/cpp_headers/scsi.o 00:02:10.653 CXX test/cpp_headers/scheduler.o 00:02:10.653 CXX test/cpp_headers/scsi_spec.o 00:02:10.653 CXX test/cpp_headers/sock.o 00:02:10.653 CXX test/cpp_headers/stdinc.o 00:02:10.653 CXX test/cpp_headers/string.o 00:02:10.653 CXX test/cpp_headers/thread.o 00:02:10.653 CXX test/cpp_headers/trace.o 00:02:10.653 CXX test/cpp_headers/trace_parser.o 00:02:10.653 CXX test/cpp_headers/tree.o 00:02:10.913 CXX test/cpp_headers/ublk.o 00:02:10.913 CXX test/cpp_headers/util.o 00:02:10.913 CXX test/cpp_headers/uuid.o 00:02:10.913 CXX test/cpp_headers/version.o 00:02:10.913 CXX test/cpp_headers/vfio_user_pci.o 00:02:10.913 CXX test/cpp_headers/vfio_user_spec.o 00:02:10.913 CXX test/cpp_headers/vhost.o 00:02:10.913 CXX test/cpp_headers/vmd.o 00:02:10.913 CXX test/cpp_headers/xor.o 00:02:10.913 CXX test/cpp_headers/zipf.o 00:02:11.171 LINK memory_ut 00:02:11.738 LINK cuse 00:02:11.738 LINK iscsi_fuzz 00:02:14.264 LINK esnap 00:02:14.829 00:02:14.830 real 1m24.948s 00:02:14.830 user 21m7.359s 00:02:14.830 sys 2m59.590s 00:02:14.830 04:03:02 make -- common/autotest_common.sh@1122 -- $ xtrace_disable 00:02:14.830 04:03:02 make -- common/autotest_common.sh@10 -- $ set +x 00:02:14.830 ************************************ 00:02:14.830 END TEST make 00:02:14.830 ************************************ 00:02:14.830 04:03:02 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:14.830 04:03:02 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:14.830 04:03:02 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:14.830 04:03:02 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.830 04:03:02 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:14.830 04:03:02 -- pm/common@44 -- $ pid=3692912 00:02:14.830 04:03:02 -- pm/common@50 -- $ kill -TERM 3692912 00:02:14.830 04:03:02 -- pm/common@42 -- $ for monitor in 
"${MONITOR_RESOURCES[@]}" 00:02:14.830 04:03:02 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:14.830 04:03:02 -- pm/common@44 -- $ pid=3692914 00:02:14.830 04:03:02 -- pm/common@50 -- $ kill -TERM 3692914 00:02:14.830 04:03:02 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.830 04:03:02 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:14.830 04:03:02 -- pm/common@44 -- $ pid=3692916 00:02:14.830 04:03:02 -- pm/common@50 -- $ kill -TERM 3692916 00:02:14.830 04:03:02 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.830 04:03:02 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:14.830 04:03:02 -- pm/common@44 -- $ pid=3692953 00:02:14.830 04:03:02 -- pm/common@50 -- $ sudo -E kill -TERM 3692953 00:02:14.830 04:03:02 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:02:14.830 04:03:02 -- nvmf/common.sh@7 -- # uname -s 00:02:14.830 04:03:02 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:14.830 04:03:02 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:14.830 04:03:02 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:14.830 04:03:02 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:14.830 04:03:02 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:14.830 04:03:02 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:14.830 04:03:02 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:14.830 04:03:02 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:14.830 04:03:02 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:14.830 04:03:02 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:14.830 04:03:02 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8b464f06-2980-e311-ba20-001e67a94acd 00:02:14.830 04:03:02 -- nvmf/common.sh@18 -- # NVME_HOSTID=8b464f06-2980-e311-ba20-001e67a94acd 00:02:14.830 04:03:02 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:14.830 04:03:02 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:14.830 04:03:02 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:14.830 04:03:02 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:14.830 04:03:02 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:14.830 04:03:02 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:14.830 04:03:02 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:14.830 04:03:02 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:14.830 04:03:02 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:14.830 04:03:02 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:14.830 04:03:02 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:14.830 04:03:02 -- paths/export.sh@5 -- # export PATH 00:02:14.830 04:03:02 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:14.830 04:03:02 -- nvmf/common.sh@47 -- # : 0 00:02:14.830 04:03:02 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:14.830 04:03:02 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:14.830 04:03:02 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:14.830 04:03:02 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:14.830 04:03:02 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:14.830 04:03:02 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:14.830 04:03:02 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:14.830 04:03:02 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:14.830 04:03:02 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:14.830 04:03:02 -- spdk/autotest.sh@32 -- # uname -s 00:02:14.830 04:03:02 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:14.830 04:03:02 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:14.830 04:03:02 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:14.830 04:03:02 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:14.830 04:03:02 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:14.830 04:03:02 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:14.830 04:03:02 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:14.830 04:03:02 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:14.830 04:03:02 -- spdk/autotest.sh@48 -- # udevadm_pid=3754962 00:02:14.830 04:03:02 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:14.830 04:03:02 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:14.830 04:03:02 -- pm/common@17 -- # local monitor 00:02:14.830 04:03:02 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.830 04:03:02 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.830 04:03:02 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.830 04:03:02 -- pm/common@21 -- # date +%s 00:02:14.830 04:03:02 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:14.830 04:03:02 -- pm/common@21 -- # date +%s 00:02:14.830 04:03:02 -- pm/common@25 -- # sleep 1 00:02:14.830 04:03:02 -- pm/common@21 -- # date +%s 00:02:14.830 04:03:02 -- pm/common@21 -- # date +%s 00:02:14.830 04:03:02 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715738582 00:02:14.830 04:03:02 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715738582 00:02:14.830 
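The collect-cpu-load / collect-vmstat / collect-cpu-temp / collect-bmc-pm launches around this point, together with the kill -TERM loop over *.pid files earlier in this section, follow a simple pid-file start/stop pattern. A minimal sketch of that pattern with hypothetical helper names; only the power output directory, the -d/-l/-p arguments, and the timestamped suffix are taken from the log:

    out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power
    suffix="monitor.autotest.sh.$(date +%s)"

    start_monitor() {                      # hypothetical helper name
        local name=$1; shift
        "$@" -d "$out" -l -p "$suffix" &   # same arguments the collect-* scripts receive above
        echo $! > "$out/$name.pid"
    }

    stop_monitors() {                      # mirrors the signal_monitor_resources TERM loop
        local pidfile
        for pidfile in "$out"/collect-*.pid; do
            [[ -e $pidfile ]] && kill -TERM "$(cat "$pidfile")" 2>/dev/null || true
        done
    }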
04:03:02 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715738582 00:02:14.830 04:03:02 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1715738582 00:02:14.830 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715738582_collect-vmstat.pm.log 00:02:14.830 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715738582_collect-cpu-load.pm.log 00:02:14.830 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715738582_collect-cpu-temp.pm.log 00:02:14.830 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1715738582_collect-bmc-pm.bmc.pm.log 00:02:15.764 04:03:03 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:15.764 04:03:03 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:15.764 04:03:03 -- common/autotest_common.sh@720 -- # xtrace_disable 00:02:15.764 04:03:03 -- common/autotest_common.sh@10 -- # set +x 00:02:15.764 04:03:03 -- spdk/autotest.sh@59 -- # create_test_list 00:02:15.764 04:03:03 -- common/autotest_common.sh@744 -- # xtrace_disable 00:02:15.764 04:03:03 -- common/autotest_common.sh@10 -- # set +x 00:02:15.764 04:03:03 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:15.764 04:03:03 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:15.764 04:03:03 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:15.764 04:03:03 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:15.764 04:03:03 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:15.764 04:03:03 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:15.764 04:03:03 -- common/autotest_common.sh@1451 -- # uname 00:02:15.764 04:03:03 -- common/autotest_common.sh@1451 -- # '[' Linux = FreeBSD ']' 00:02:15.764 04:03:03 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:15.764 04:03:03 -- common/autotest_common.sh@1471 -- # uname 00:02:15.764 04:03:03 -- common/autotest_common.sh@1471 -- # [[ Linux = FreeBSD ]] 00:02:15.764 04:03:03 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:15.764 04:03:03 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:15.764 04:03:03 -- spdk/autotest.sh@72 -- # hash lcov 00:02:15.764 04:03:03 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:15.764 04:03:03 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:15.764 --rc lcov_branch_coverage=1 00:02:15.764 --rc lcov_function_coverage=1 00:02:15.764 --rc genhtml_branch_coverage=1 00:02:15.764 --rc genhtml_function_coverage=1 00:02:15.764 --rc genhtml_legend=1 00:02:15.764 --rc geninfo_all_blocks=1 00:02:15.764 ' 00:02:15.764 04:03:03 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:15.764 --rc lcov_branch_coverage=1 00:02:15.764 --rc lcov_function_coverage=1 00:02:15.764 --rc genhtml_branch_coverage=1 00:02:15.764 --rc genhtml_function_coverage=1 00:02:15.764 --rc genhtml_legend=1 00:02:15.764 --rc geninfo_all_blocks=1 00:02:15.764 ' 00:02:15.764 04:03:03 -- spdk/autotest.sh@81 -- 
# export 'LCOV=lcov 00:02:15.764 --rc lcov_branch_coverage=1 00:02:15.764 --rc lcov_function_coverage=1 00:02:15.764 --rc genhtml_branch_coverage=1 00:02:15.764 --rc genhtml_function_coverage=1 00:02:15.764 --rc genhtml_legend=1 00:02:15.764 --rc geninfo_all_blocks=1 00:02:15.764 --no-external' 00:02:15.764 04:03:03 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:15.764 --rc lcov_branch_coverage=1 00:02:15.764 --rc lcov_function_coverage=1 00:02:15.764 --rc genhtml_branch_coverage=1 00:02:15.764 --rc genhtml_function_coverage=1 00:02:15.764 --rc genhtml_legend=1 00:02:15.764 --rc geninfo_all_blocks=1 00:02:15.764 --no-external' 00:02:15.764 04:03:03 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:02:16.022 lcov: LCOV version 1.14 00:02:16.022 04:03:03 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:02:30.882 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:02:30.882 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:02:30.882 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno:no functions found 00:02:30.882 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:02:30.882 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno:no functions found 00:02:30.882 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:02:30.882 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno:no functions found 00:02:30.882 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:02:48.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:02:48.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:02:48.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:02:48.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:02:48.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:02:48.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:02:48.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:02:48.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:02:48.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:02:48.993 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:02:48.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:02:48.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:02:48.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:02:48.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:02:48.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:02:48.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:02:48.993 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:02:48.993 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:02:48.994 geninfo: WARNING: GCOV did not produce 
any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:02:48.994 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:02:48.995 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:02:48.995 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:02:48.995 04:03:36 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:02:48.995 04:03:36 -- common/autotest_common.sh@720 -- # xtrace_disable 00:02:48.995 04:03:36 -- common/autotest_common.sh@10 -- 
# set +x 00:02:48.995 04:03:36 -- spdk/autotest.sh@91 -- # rm -f 00:02:48.995 04:03:36 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:02:49.928 0000:81:00.0 (8086 0a54): Already using the nvme driver 00:02:49.928 0000:00:04.7 (8086 0e27): Already using the ioatdma driver 00:02:49.928 0000:00:04.6 (8086 0e26): Already using the ioatdma driver 00:02:49.928 0000:00:04.5 (8086 0e25): Already using the ioatdma driver 00:02:49.928 0000:00:04.4 (8086 0e24): Already using the ioatdma driver 00:02:49.928 0000:00:04.3 (8086 0e23): Already using the ioatdma driver 00:02:49.928 0000:00:04.2 (8086 0e22): Already using the ioatdma driver 00:02:49.928 0000:00:04.1 (8086 0e21): Already using the ioatdma driver 00:02:49.928 0000:00:04.0 (8086 0e20): Already using the ioatdma driver 00:02:49.928 0000:80:04.7 (8086 0e27): Already using the ioatdma driver 00:02:49.928 0000:80:04.6 (8086 0e26): Already using the ioatdma driver 00:02:49.928 0000:80:04.5 (8086 0e25): Already using the ioatdma driver 00:02:49.928 0000:80:04.4 (8086 0e24): Already using the ioatdma driver 00:02:49.928 0000:80:04.3 (8086 0e23): Already using the ioatdma driver 00:02:49.928 0000:80:04.2 (8086 0e22): Already using the ioatdma driver 00:02:49.928 0000:80:04.1 (8086 0e21): Already using the ioatdma driver 00:02:49.928 0000:80:04.0 (8086 0e20): Already using the ioatdma driver 00:02:50.186 04:03:37 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:02:50.186 04:03:37 -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:02:50.186 04:03:37 -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:02:50.186 04:03:37 -- common/autotest_common.sh@1666 -- # local nvme bdf 00:02:50.186 04:03:37 -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:02:50.186 04:03:37 -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:02:50.186 04:03:37 -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:02:50.186 04:03:37 -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:50.186 04:03:37 -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:02:50.186 04:03:37 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:02:50.186 04:03:37 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:02:50.186 04:03:37 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:02:50.186 04:03:37 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:02:50.186 04:03:37 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:02:50.186 04:03:37 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:50.186 No valid GPT data, bailing 00:02:50.186 04:03:38 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:50.186 04:03:38 -- scripts/common.sh@391 -- # pt= 00:02:50.186 04:03:38 -- scripts/common.sh@392 -- # return 1 00:02:50.186 04:03:38 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:50.186 1+0 records in 00:02:50.186 1+0 records out 00:02:50.186 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00253894 s, 413 MB/s 00:02:50.186 04:03:38 -- spdk/autotest.sh@118 -- # sync 00:02:50.186 04:03:38 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:50.186 04:03:38 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:50.186 04:03:38 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:52.140 04:03:39 -- spdk/autotest.sh@124 -- # uname -s 00:02:52.140 04:03:39 -- spdk/autotest.sh@124 -- # 
'[' Linux = Linux ']' 00:02:52.140 04:03:39 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:02:52.140 04:03:39 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:02:52.140 04:03:39 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:02:52.140 04:03:39 -- common/autotest_common.sh@10 -- # set +x 00:02:52.140 ************************************ 00:02:52.140 START TEST setup.sh 00:02:52.140 ************************************ 00:02:52.140 04:03:39 setup.sh -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:02:52.140 * Looking for test storage... 00:02:52.140 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:02:52.140 04:03:39 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:02:52.140 04:03:39 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:02:52.140 04:03:39 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:02:52.140 04:03:39 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:02:52.140 04:03:39 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:02:52.140 04:03:39 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:02:52.140 ************************************ 00:02:52.140 START TEST acl 00:02:52.140 ************************************ 00:02:52.140 04:03:39 setup.sh.acl -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:02:52.140 * Looking for test storage... 00:02:52.140 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:02:52.140 04:03:40 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:02:52.140 04:03:40 setup.sh.acl -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:02:52.140 04:03:40 setup.sh.acl -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:02:52.140 04:03:40 setup.sh.acl -- common/autotest_common.sh@1666 -- # local nvme bdf 00:02:52.140 04:03:40 setup.sh.acl -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:02:52.140 04:03:40 setup.sh.acl -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:02:52.140 04:03:40 setup.sh.acl -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:02:52.140 04:03:40 setup.sh.acl -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:52.140 04:03:40 setup.sh.acl -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:02:52.140 04:03:40 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:02:52.140 04:03:40 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:02:52.140 04:03:40 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:02:52.140 04:03:40 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:02:52.140 04:03:40 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:02:52.140 04:03:40 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:52.140 04:03:40 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:02:53.513 04:03:41 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:02:53.513 04:03:41 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:02:53.513 04:03:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:53.513 04:03:41 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:02:53.513 04:03:41 setup.sh.acl -- setup/common.sh@9 -- # 
[[ output == output ]] 00:02:53.513 04:03:41 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:02:54.885 Hugepages 00:02:54.885 node hugesize free / total 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 00:02:54.885 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 
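The acl.sh trace lines here and just below walk the 'setup.sh status' output one row at a time: rows that are not PCI addresses are skipped, ioatdma rows are skipped with 'continue', and nvme devices are collected into devs/drivers. A condensed sketch of that loop; the column layout is taken from the traced read call, and the PCI_BLOCKED check from acl.sh@21 is omitted:

    declare -a devs
    declare -A drivers
    while read -r _ dev _ _ _ driver _; do        # same field split as the traced read
        [[ $dev == *:*:*.* ]] || continue         # drops the Hugepages / header rows
        [[ $driver == nvme ]] || continue         # ioatdma rows fall through to the next read
        devs+=("$dev")
        drivers["$dev"]=$driver
    done < <(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status)
    echo "found ${#devs[@]} nvme device(s): ${devs[*]}"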
00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:54.885 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:55.142 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:02:55.142 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:55.142 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:55.142 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:55.142 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:02:55.142 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:55.142 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:55.142 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:55.142 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:02:55.143 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:55.143 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:02:55.143 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:55.143 04:03:42 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:81:00.0 == *:*:*.* ]] 00:02:55.143 04:03:42 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:02:55.143 04:03:42 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\8\1\:\0\0\.\0* ]] 00:02:55.143 04:03:42 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:02:55.143 04:03:42 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:02:55.143 04:03:42 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:55.143 04:03:42 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:02:55.143 04:03:42 setup.sh.acl -- 
setup/acl.sh@54 -- # run_test denied denied 00:02:55.143 04:03:42 setup.sh.acl -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:02:55.143 04:03:42 setup.sh.acl -- common/autotest_common.sh@1103 -- # xtrace_disable 00:02:55.143 04:03:42 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:02:55.143 ************************************ 00:02:55.143 START TEST denied 00:02:55.143 ************************************ 00:02:55.143 04:03:42 setup.sh.acl.denied -- common/autotest_common.sh@1121 -- # denied 00:02:55.143 04:03:42 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:81:00.0' 00:02:55.143 04:03:42 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:02:55.143 04:03:42 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:81:00.0' 00:02:55.143 04:03:42 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:02:55.143 04:03:42 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:02:56.511 0000:81:00.0 (8086 0a54): Skipping denied controller at 0000:81:00.0 00:02:56.511 04:03:44 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:81:00.0 00:02:56.511 04:03:44 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:02:56.511 04:03:44 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:02:56.512 04:03:44 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:81:00.0 ]] 00:02:56.512 04:03:44 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:81:00.0/driver 00:02:56.512 04:03:44 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:02:56.512 04:03:44 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:02:56.512 04:03:44 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:02:56.512 04:03:44 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:56.512 04:03:44 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:02:59.034 00:02:59.034 real 0m4.006s 00:02:59.034 user 0m1.251s 00:02:59.034 sys 0m1.896s 00:02:59.034 04:03:46 setup.sh.acl.denied -- common/autotest_common.sh@1122 -- # xtrace_disable 00:02:59.034 04:03:46 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:02:59.034 ************************************ 00:02:59.034 END TEST denied 00:02:59.034 ************************************ 00:02:59.034 04:03:47 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:02:59.034 04:03:47 setup.sh.acl -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:02:59.034 04:03:47 setup.sh.acl -- common/autotest_common.sh@1103 -- # xtrace_disable 00:02:59.034 04:03:47 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:02:59.034 ************************************ 00:02:59.034 START TEST allowed 00:02:59.034 ************************************ 00:02:59.034 04:03:47 setup.sh.acl.allowed -- common/autotest_common.sh@1121 -- # allowed 00:02:59.034 04:03:47 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:81:00.0 00:02:59.034 04:03:47 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:02:59.034 04:03:47 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:81:00.0 .*: nvme -> .*' 00:02:59.034 04:03:47 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:02:59.034 04:03:47 setup.sh.acl.allowed -- setup/common.sh@10 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:02.312 0000:81:00.0 (8086 0a54): nvme -> vfio-pci 00:03:02.312 04:03:50 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:02.312 04:03:50 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:02.312 04:03:50 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:02.312 04:03:50 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:02.312 04:03:50 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:04.210 00:03:04.210 real 0m4.897s 00:03:04.210 user 0m1.195s 00:03:04.210 sys 0m1.833s 00:03:04.210 04:03:51 setup.sh.acl.allowed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:04.210 04:03:51 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:04.210 ************************************ 00:03:04.210 END TEST allowed 00:03:04.210 ************************************ 00:03:04.210 00:03:04.210 real 0m12.006s 00:03:04.210 user 0m3.710s 00:03:04.210 sys 0m5.654s 00:03:04.210 04:03:51 setup.sh.acl -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:04.210 04:03:51 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:04.210 ************************************ 00:03:04.210 END TEST acl 00:03:04.210 ************************************ 00:03:04.210 04:03:51 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:04.210 04:03:51 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:04.210 04:03:51 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:04.210 04:03:51 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:04.210 ************************************ 00:03:04.210 START TEST hugepages 00:03:04.210 ************************************ 00:03:04.210 04:03:52 setup.sh.hugepages -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:04.210 * Looking for test storage... 
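The denied and allowed tests traced above exercise the PCI filtering in SPDK's scripts/setup.sh: with PCI_BLOCKED=' 0000:81:00.0' the config pass must print "Skipping denied controller at 0000:81:00.0", and with PCI_ALLOWED=0000:81:00.0 the same controller must be rebound away from the nvme driver (nvme -> vfio-pci). A minimal sketch of those two checks follows; the SPDK_DIR path is a placeholder assumption, while the BDF and the grep patterns are taken from the trace itself.

  SPDK_DIR=/path/to/spdk                        # assumption: local SPDK checkout, not this job's workspace
  BDF=0000:81:00.0                              # controller BDF as seen in the trace

  # "denied": a blocked controller must be skipped by "setup.sh config"
  PCI_BLOCKED=" $BDF" "$SPDK_DIR/scripts/setup.sh" config \
      | grep "Skipping denied controller at $BDF"

  # "allowed": with only this controller allowed, it must be rebound from nvme
  PCI_ALLOWED="$BDF" "$SPDK_DIR/scripts/setup.sh" config \
      | grep -E "$BDF .*: nvme -> .*"

  # hand the device back to the kernel driver afterwards
  "$SPDK_DIR/scripts/setup.sh" reset

The real acl.sh helpers do the same thing, running "setup reset" between the two passes, which is why the log alternates between config and reset invocations above.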
00:03:04.210 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 43901460 kB' 'MemAvailable: 47604404 kB' 'Buffers: 3728 kB' 'Cached: 10110976 kB' 'SwapCached: 0 kB' 'Active: 7098784 kB' 'Inactive: 3504620 kB' 'Active(anon): 6498244 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 492036 kB' 'Mapped: 201224 kB' 'Shmem: 6009544 kB' 'KReclaimable: 183248 kB' 'Slab: 545284 kB' 'SReclaimable: 183248 kB' 'SUnreclaim: 362036 kB' 'KernelStack: 12944 kB' 'PageTables: 8328 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36562320 kB' 'Committed_AS: 7623304 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198296 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 
setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- 
setup/common.sh@31 -- # IFS=': ' 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.210 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 
04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages 
-- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.211 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:04.212 04:03:52 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:04.212 04:03:52 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:04.212 04:03:52 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:04.212 04:03:52 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:04.212 04:03:52 
setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:04.212 ************************************ 00:03:04.212 START TEST default_setup 00:03:04.212 ************************************ 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1121 -- # default_setup 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:03:04.212 04:03:52 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:05.603 0000:00:04.7 (8086 0e27): ioatdma -> vfio-pci 00:03:05.603 0000:00:04.6 (8086 0e26): ioatdma -> vfio-pci 00:03:05.603 0000:00:04.5 (8086 0e25): ioatdma -> vfio-pci 00:03:05.603 0000:00:04.4 (8086 0e24): ioatdma -> vfio-pci 00:03:05.603 0000:00:04.3 (8086 0e23): ioatdma -> vfio-pci 00:03:05.603 0000:00:04.2 (8086 0e22): ioatdma -> vfio-pci 00:03:05.603 0000:00:04.1 (8086 0e21): ioatdma -> vfio-pci 00:03:05.603 0000:00:04.0 (8086 0e20): ioatdma -> vfio-pci 00:03:05.603 0000:80:04.7 (8086 0e27): ioatdma -> vfio-pci 00:03:05.603 0000:80:04.6 (8086 0e26): ioatdma -> vfio-pci 00:03:05.603 0000:80:04.5 (8086 0e25): ioatdma -> vfio-pci 00:03:05.603 0000:80:04.4 (8086 0e24): ioatdma -> vfio-pci 00:03:05.603 0000:80:04.3 (8086 0e23): ioatdma -> vfio-pci 00:03:05.603 0000:80:04.2 (8086 0e22): ioatdma -> vfio-pci 00:03:05.603 0000:80:04.1 (8086 0e21): ioatdma -> vfio-pci 00:03:05.603 
0000:80:04.0 (8086 0e20): ioatdma -> vfio-pci 00:03:07.514 0000:81:00.0 (8086 0a54): nvme -> vfio-pci 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.514 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46007304 kB' 'MemAvailable: 49710276 kB' 'Buffers: 3728 kB' 'Cached: 10111084 kB' 'SwapCached: 0 kB' 'Active: 7122060 kB' 'Inactive: 3504620 kB' 'Active(anon): 6521520 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515124 kB' 'Mapped: 202104 kB' 'Shmem: 6009652 kB' 'KReclaimable: 183304 kB' 'Slab: 544848 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361544 kB' 'KernelStack: 12880 kB' 'PageTables: 7848 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7646508 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198396 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:07.515 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.515 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:07.516 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46007156 kB' 'MemAvailable: 49710128 kB' 'Buffers: 3728 kB' 'Cached: 10111084 kB' 'SwapCached: 0 kB' 'Active: 7116592 kB' 'Inactive: 3504620 kB' 'Active(anon): 6516052 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509712 kB' 'Mapped: 201728 kB' 'Shmem: 6009652 kB' 'KReclaimable: 183304 kB' 'Slab: 544988 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361684 kB' 'KernelStack: 12800 kB' 'PageTables: 7680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7640404 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198328 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.516 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 
04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.517 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # 
continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46006904 kB' 'MemAvailable: 49709876 kB' 'Buffers: 3728 kB' 'Cached: 10111104 kB' 'SwapCached: 0 kB' 'Active: 7116848 kB' 'Inactive: 3504620 kB' 'Active(anon): 6516308 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 
8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509896 kB' 'Mapped: 201336 kB' 'Shmem: 6009672 kB' 'KReclaimable: 183304 kB' 'Slab: 544996 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361692 kB' 'KernelStack: 12960 kB' 'PageTables: 8372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7640428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198328 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.518 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # 
[[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.519 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:07.520 nr_hugepages=1024 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:07.520 resv_hugepages=0 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:07.520 surplus_hugepages=0 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:07.520 anon_hugepages=0 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46007500 kB' 'MemAvailable: 49710472 kB' 'Buffers: 3728 kB' 'Cached: 10111124 kB' 'SwapCached: 0 kB' 'Active: 7116748 kB' 'Inactive: 3504620 kB' 'Active(anon): 6516208 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509780 kB' 'Mapped: 201244 kB' 'Shmem: 6009692 kB' 'KReclaimable: 183304 kB' 'Slab: 545028 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361724 kB' 'KernelStack: 12960 kB' 'PageTables: 8368 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 
37610896 kB' 'Committed_AS: 7640448 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198328 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.520 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.521 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:07.522 04:03:55 
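The wall of 'continue' entries above is the xtrace of a loop that walks every field of a meminfo-style file and skips it unless it is the key being looked up, here HugePages_Total, whose value (1024) is then echoed back. A minimal standalone sketch of that key-scan pattern, assuming a plain key/value file; get_meminfo_value is an illustrative name, not the exact SPDK helper:

#!/usr/bin/env bash
# Scan a meminfo-style file for one key and echo its value.
# Every key that does not match shows up as a 'continue' in xtrace output.
get_meminfo_value() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

get_meminfo_value HugePages_Total   # prints 1024 on the node traced above

Once the value is back, the trace checks (( 1024 == nr_hugepages + surp + resv )), i.e. the kernel's configured total must equal the requested page count plus any surplus and reserved pages.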
setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32876940 kB' 'MemFree: 21137756 kB' 'MemUsed: 11739184 kB' 'SwapCached: 0 kB' 'Active: 5333576 kB' 'Inactive: 3352784 kB' 'Active(anon): 4980112 kB' 'Inactive(anon): 0 kB' 'Active(file): 353464 kB' 'Inactive(file): 3352784 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8427732 kB' 'Mapped: 135784 kB' 'AnonPages: 261904 kB' 'Shmem: 4721484 kB' 'KernelStack: 7976 kB' 'PageTables: 4336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97308 kB' 'Slab: 301520 kB' 'SReclaimable: 97308 kB' 'SUnreclaim: 204212 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- 
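Just above, the same lookup is repeated for HugePages_Surp on node 0: the trace shows the helper switching from /proc/meminfo to /sys/devices/system/node/node0/meminfo when that file exists, reading it with mapfile, and stripping the leading "Node 0 " from every line so the same key/value scan can be reused. A sketch of that per-node variant, under the same assumptions as before (node_meminfo_value is an illustrative name):

#!/usr/bin/env bash
# Read a counter from the per-node meminfo file when a node id is given,
# falling back to the system-wide /proc/meminfo otherwise.
shopt -s extglob   # needed for the +([0-9]) pattern used to strip the prefix

node_meminfo_value() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo
    local mem line var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

node_meminfo_value HugePages_Surp 0   # prints 0 for node0 in the trace above

The node0 snapshot printed in the trace (HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Surp: 0) is what the surrounding default_setup test goes on to verify.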
setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.522 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:07.523 node0=1024 expecting 1024 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:07.523 00:03:07.523 real 0m3.334s 00:03:07.523 user 0m0.673s 00:03:07.523 sys 0m0.842s 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:07.523 04:03:55 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:03:07.523 ************************************ 00:03:07.523 END TEST default_setup 00:03:07.523 ************************************ 00:03:07.523 04:03:55 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:07.523 04:03:55 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:07.523 04:03:55 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:07.523 04:03:55 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:07.780 ************************************ 00:03:07.780 START TEST per_node_1G_alloc 00:03:07.780 ************************************ 00:03:07.780 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1121 -- # per_node_1G_alloc 00:03:07.780 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:03:07.780 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:07.780 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:07.780 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:07.780 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:03:07.780 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:07.780 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:07.780 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:07.780 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:07.781 
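At this point default_setup wraps up: the node-0 tally (free pages plus surplus) comes out to 1024, is echoed as 'node0=1024 expecting 1024' and asserted, and the test finishes in roughly 3.3 seconds before run_test moves on to per_node_1G_alloc. A simplified sketch of that final tally, assuming the per-node counts have already been gathered; the real script also keeps an expected-count array (nodes_sys) and compares the two via sorted key sets, which is omitted here:

#!/usr/bin/env bash
# Illustrative values taken from the trace; only the comparison logic is sketched.
nodes_test=( [0]=1024 )   # node0 free huge pages plus HugePages_Surp (0)
expected=1024

for node in "${!nodes_test[@]}"; do
    echo "node${node}=${nodes_test[node]} expecting ${expected}"
    (( nodes_test[node] == expected )) || exit 1
done
echo 'default_setup: per-node huge page count verified'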
04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:07.781 04:03:55 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:09.160 0000:00:04.7 (8086 0e27): Already using the vfio-pci driver 00:03:09.160 0000:81:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:09.160 0000:00:04.6 (8086 0e26): Already using the vfio-pci driver 00:03:09.160 0000:00:04.5 (8086 0e25): Already using the vfio-pci driver 00:03:09.160 0000:00:04.4 (8086 0e24): Already using the vfio-pci driver 00:03:09.160 0000:00:04.3 (8086 0e23): Already using the vfio-pci driver 00:03:09.160 0000:00:04.2 (8086 0e22): Already using the vfio-pci driver 00:03:09.160 0000:00:04.1 (8086 0e21): Already using the vfio-pci driver 00:03:09.160 0000:00:04.0 (8086 0e20): Already using the vfio-pci driver 00:03:09.160 0000:80:04.7 (8086 0e27): Already using the vfio-pci driver 00:03:09.160 0000:80:04.6 (8086 0e26): Already using the vfio-pci driver 00:03:09.160 0000:80:04.5 (8086 0e25): Already using the vfio-pci driver 00:03:09.160 0000:80:04.4 (8086 0e24): Already using the vfio-pci driver 00:03:09.160 0000:80:04.3 (8086 0e23): Already using the vfio-pci driver 00:03:09.160 0000:80:04.2 (8086 0e22): Already using the vfio-pci driver 00:03:09.160 0000:80:04.1 (8086 0e21): Already using the vfio-pci driver 00:03:09.160 0000:80:04.0 (8086 0e20): Already using the vfio-pci driver 00:03:09.160 04:03:56 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:09.160 04:03:57 
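The new test asks get_test_nr_hugepages for 1048576 kB spread over nodes 0 and 1: divided by the 2048 kB default huge page size that is 512 pages, and the trace assigns 512 to each listed node, which is exactly what the NRHUGE=512 HUGENODE=0,1 invocation of scripts/setup.sh above encodes (the PCI devices it touches are already bound to vfio-pci, hence the driver messages). A sketch of that sizing arithmetic; the variable names are illustrative:

#!/usr/bin/env bash
# 1048576 kB requested, divided by the default huge page size reported by the kernel.
size_kb=1048576
hugepagesize_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)   # 2048 kB in the dumps in this log
nodes=(0 1)

nr_hugepages=$(( size_kb / hugepagesize_kb ))   # 512 pages requested on each node listed below
echo "NRHUGE=${nr_hugepages} HUGENODE=$(IFS=,; echo "${nodes[*]}")"
# The trace then runs the allocation with these in the environment, e.g.:
#   NRHUGE=512 HUGENODE=0,1 ./scripts/setup.sh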
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46001348 kB' 'MemAvailable: 49704320 kB' 'Buffers: 3728 kB' 'Cached: 10111188 kB' 'SwapCached: 0 kB' 'Active: 7117024 kB' 'Inactive: 3504620 kB' 'Active(anon): 6516484 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509936 kB' 'Mapped: 201388 kB' 'Shmem: 6009756 kB' 'KReclaimable: 183304 kB' 'Slab: 545052 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361748 kB' 'KernelStack: 12928 kB' 'PageTables: 8228 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7640164 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198520 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.160 04:03:57 
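verify_nr_hugepages starts with anonymous huge pages: the guard above tests the transparent-hugepage mode string ('always [madvise] never') against *[never]*, and since THP is not disabled it fetches AnonHugePages from the system-wide /proc/meminfo snapshot just printed (0 kB here). A standalone sketch of that check, assuming the usual sysfs path for the THP mode:

#!/usr/bin/env bash
# Read the THP mode and, unless it is set to [never], the AnonHugePages counter.
thp_state=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
if [[ $thp_state != *"[never]"* ]]; then
    anon_kb=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
else
    anon_kb=0   # THP disabled, so no anonymous huge pages to account for
fi
echo "AnonHugePages: ${anon_kb} kB"   # 0 kB in the run above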
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.160 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.161 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # 
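With anon settled at 0, the trace immediately starts another full scan of /proc/meminfo, this time for HugePages_Surp. If only the global values are needed, a single pass can collect them all at once; a sketch of that alternative (the hp array and loop are illustrative, not part of the SPDK helpers):

#!/usr/bin/env bash
# Collect every global HugePages_* counter from /proc/meminfo in a single pass.
declare -A hp
while IFS=': ' read -r key val _; do
    [[ $key == HugePages_* ]] && hp[$key]=$val
done < /proc/meminfo

printf '%s=%s\n' Total "${hp[HugePages_Total]}" Free "${hp[HugePages_Free]}" \
                 Rsvd "${hp[HugePages_Rsvd]}" Surp "${hp[HugePages_Surp]}"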
[[ -n '' ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46001784 kB' 'MemAvailable: 49704756 kB' 'Buffers: 3728 kB' 'Cached: 10111188 kB' 'SwapCached: 0 kB' 'Active: 7117924 kB' 'Inactive: 3504620 kB' 'Active(anon): 6517384 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510844 kB' 'Mapped: 201388 kB' 'Shmem: 6009756 kB' 'KReclaimable: 183304 kB' 'Slab: 545052 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361748 kB' 'KernelStack: 12976 kB' 'PageTables: 8380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7640180 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198488 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 
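The snapshot just printed reports HugePages_Total: 1024, Hugepagesize: 2048 kB and Hugetlb: 2097152 kB; when only the default page size is in use, the last figure is simply the product of the first two, which makes a quick consistency check on the allocation. A small sketch of that arithmetic (assumption: a single huge page size is configured):

#!/usr/bin/env bash
# Hugetlb (kB) should equal HugePages_Total * Hugepagesize when only one size is in use.
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)   # 1024 in the snapshot above
pagesz=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)     # 2048 kB
hugetlb=$(awk '/^Hugetlb:/ {print $2}' /proc/meminfo)         # 2097152 kB

(( total * pagesz == hugetlb )) && echo "consistent: ${total} x ${pagesz} kB = ${hugetlb} kB"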
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.162 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.163 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46001440 kB' 'MemAvailable: 49704412 kB' 'Buffers: 3728 kB' 'Cached: 10111212 kB' 'SwapCached: 0 kB' 'Active: 7116808 kB' 'Inactive: 3504620 kB' 'Active(anon): 6516268 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509692 kB' 'Mapped: 201248 kB' 'Shmem: 6009780 kB' 'KReclaimable: 183304 kB' 'Slab: 545044 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361740 kB' 'KernelStack: 13008 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7640204 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198488 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.164 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
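
Editor's note: later in this trace the same lookup is repeated per NUMA node (node=0): mem_f switches from /proc/meminfo to /sys/devices/system/node/node0/meminfo, and each line's leading "Node 0 " prefix is stripped before parsing, which is what the mem=("${mem[@]#Node +([0-9]) }") entries correspond to. A minimal sketch of that per-node variant, assuming the standard sysfs layout and not the exact setup/common.sh code (get_node_meminfo_field is a hypothetical name):

    #!/usr/bin/env bash
    shopt -s extglob   # enables the +([0-9]) pattern used in the prefix strip below

    # Hypothetical per-node variant: print one field from a NUMA node's meminfo,
    # falling back to the system-wide file when the node file is absent
    # (mirroring the -e test seen in this trace).
    get_node_meminfo_field() {
        local get=$1 node=$2
        local mem_f=/proc/meminfo
        [[ -e /sys/devices/system/node/node$node/meminfo ]] \
            && mem_f=/sys/devices/system/node/node$node/meminfo
        local line var val _
        while IFS= read -r line; do
            line=${line#Node +([0-9]) }        # drop the "Node 0 " prefix, if present
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < "$mem_f"
        return 1
    }

    get_node_meminfo_field HugePages_Total 0   # should print 512 for node 0 in this log
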
00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 
04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.165 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:09.166 nr_hugepages=1024 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:09.166 resv_hugepages=0 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:09.166 surplus_hugepages=0 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:09.166 anon_hugepages=0 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:09.166 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46000684 kB' 'MemAvailable: 49703656 kB' 'Buffers: 3728 kB' 'Cached: 10111236 kB' 'SwapCached: 0 kB' 'Active: 7116800 kB' 'Inactive: 3504620 kB' 'Active(anon): 6516260 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509656 kB' 'Mapped: 201248 kB' 'Shmem: 6009804 kB' 'KReclaimable: 183304 kB' 'Slab: 545044 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361740 kB' 'KernelStack: 12992 kB' 'PageTables: 8320 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7640228 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198488 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.166 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:09.167 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:09.168 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32876940 kB' 'MemFree: 22186892 kB' 'MemUsed: 10690048 kB' 'SwapCached: 0 kB' 'Active: 5334660 kB' 'Inactive: 3352784 kB' 'Active(anon): 4981196 kB' 'Inactive(anon): 0 kB' 'Active(file): 353464 kB' 'Inactive(file): 3352784 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8427792 kB' 'Mapped: 135788 kB' 'AnonPages: 262800 kB' 'Shmem: 4721544 kB' 'KernelStack: 8040 kB' 'PageTables: 4356 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97308 kB' 'Slab: 301584 kB' 'SReclaimable: 97308 kB' 'SUnreclaim: 204276 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.168 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.169 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.169 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27664796 kB' 'MemFree: 23813540 kB' 'MemUsed: 3851256 kB' 'SwapCached: 0 kB' 'Active: 1782076 kB' 'Inactive: 151836 kB' 'Active(anon): 1535000 kB' 'Inactive(anon): 0 kB' 'Active(file): 247076 kB' 'Inactive(file): 151836 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1687208 kB' 'Mapped: 65460 kB' 'AnonPages: 246736 kB' 'Shmem: 1288296 kB' 'KernelStack: 4936 kB' 'PageTables: 3916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 85996 kB' 'Slab: 243460 kB' 'SReclaimable: 85996 kB' 'SUnreclaim: 157464 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.170 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 
-- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:09.171 node0=512 expecting 512 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:09.171 node1=512 expecting 512 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:09.171 00:03:09.171 real 0m1.626s 00:03:09.171 user 0m0.707s 00:03:09.171 sys 0m0.889s 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:09.171 04:03:57 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:09.171 ************************************ 00:03:09.171 END TEST per_node_1G_alloc 00:03:09.171 ************************************ 00:03:09.429 04:03:57 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:09.429 04:03:57 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:09.429 04:03:57 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:09.429 04:03:57 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:09.429 ************************************ 00:03:09.429 START TEST even_2G_alloc 00:03:09.429 ************************************ 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1121 -- # even_2G_alloc 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@49 -- # local size=2097152 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:09.429 04:03:57 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:10.809 0000:00:04.7 (8086 0e27): Already using the vfio-pci driver 00:03:10.809 0000:81:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:10.809 0000:00:04.6 (8086 0e26): Already using the vfio-pci driver 00:03:10.809 0000:00:04.5 (8086 0e25): Already using the vfio-pci driver 00:03:10.809 0000:00:04.4 (8086 0e24): Already using the vfio-pci driver 00:03:10.809 0000:00:04.3 (8086 0e23): Already using the vfio-pci driver 00:03:10.809 0000:00:04.2 (8086 0e22): Already using the vfio-pci driver 00:03:10.809 0000:00:04.1 (8086 0e21): Already using the vfio-pci driver 00:03:10.809 0000:00:04.0 (8086 0e20): Already using the vfio-pci driver 00:03:10.809 0000:80:04.7 (8086 0e27): Already using the vfio-pci driver 00:03:10.809 0000:80:04.6 (8086 0e26): Already using the vfio-pci driver 
00:03:10.809 0000:80:04.5 (8086 0e25): Already using the vfio-pci driver 00:03:10.809 0000:80:04.4 (8086 0e24): Already using the vfio-pci driver 00:03:10.809 0000:80:04.3 (8086 0e23): Already using the vfio-pci driver 00:03:10.809 0000:80:04.2 (8086 0e22): Already using the vfio-pci driver 00:03:10.809 0000:80:04.1 (8086 0e21): Already using the vfio-pci driver 00:03:10.809 0000:80:04.0 (8086 0e20): Already using the vfio-pci driver 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45987076 kB' 'MemAvailable: 49690048 kB' 'Buffers: 3728 kB' 'Cached: 10111332 kB' 'SwapCached: 0 kB' 'Active: 7117392 kB' 'Inactive: 3504620 kB' 'Active(anon): 6516852 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 510144 kB' 'Mapped: 201336 kB' 'Shmem: 6009900 kB' 'KReclaimable: 183304 kB' 'Slab: 544796 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361492 kB' 'KernelStack: 13056 kB' 'PageTables: 8464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7641972 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198552 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 
'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.809 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
[... setup/common.sh@31-32: per-field scan of the /proc/meminfo snapshot continues (Zswapped .. HardwareCorrupted), none matching AnonHugePages; repetitive IFS=': ' / read -r var val _ / continue trace elided ...]
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:10.810 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:10.811 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45987076 kB' 'MemAvailable: 49690048 kB' 'Buffers: 3728 kB' 'Cached: 10111332 kB' 'SwapCached: 0 kB' 'Active: 7118240 kB' 'Inactive: 3504620 kB' 'Active(anon): 6517700 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511008 kB' 'Mapped: 201336 kB' 'Shmem: 6009900 kB' 'KReclaimable: 183304 kB' 'Slab: 544788 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361484 kB' 'KernelStack: 13264 kB' 'PageTables: 8796 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7641620 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198664 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB'
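The trace above is setup/common.sh's get_meminfo walking a /proc/meminfo snapshot with IFS=': ' read -r var val _, skipping every field until the requested key matches and then echoing its value. A minimal standalone sketch of that lookup pattern follows; the function name get_meminfo_value is illustrative only and not part of the SPDK scripts, and it reads /proc/meminfo directly rather than the mapfile'd snapshot the script uses.

  # Sketch only: the same "IFS=': ' read -r var val _" lookup the trace steps through.
  get_meminfo_value() {
      local get=$1 var val _
      while IFS=': ' read -r var val _; do
          # e.g. "HugePages_Surp: 0" -> var=HugePages_Surp, val=0 (the trailing "kB", if any, lands in _)
          if [[ $var == "$get" ]]; then
              echo "$val"
              return 0
          fi
      done < /proc/meminfo
      return 1
  }
  # usage: get_meminfo_value AnonHugePages   -> prints 0 on this node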
[... setup/common.sh@31-32: per-field scan of the snapshot (MemTotal .. HugePages_Rsvd), none matching HugePages_Surp; repetitive read/continue trace elided ...]
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:10.812 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45986320 kB' 'MemAvailable: 49689292 kB' 'Buffers: 3728 kB' 'Cached: 10111352 kB' 'SwapCached: 0 kB' 'Active: 7118904 kB' 'Inactive: 3504620 kB' 'Active(anon): 6518364 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511680 kB' 'Mapped: 201772 kB' 'Shmem: 6009920 kB' 'KReclaimable: 183304 kB' 'Slab: 544840 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361536 kB' 'KernelStack: 13360 kB' 'PageTables: 8916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7642736 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198712 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB'
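Both snapshots above report the same hugepage state: HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Rsvd: 0, HugePages_Surp: 0, with Hugepagesize: 2048 kB. That is consistent with the Hugetlb line and with the 2G target implied by the even_2G_alloc test name; a quick arithmetic check (not part of the log):

  # 1024 pages x 2048 kB/page = 2097152 kB = 2 GiB, matching 'Hugetlb: 2097152 kB'
  echo "$(( 1024 * 2048 )) kB"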
[... setup/common.sh@31-32: per-field scan of the snapshot (MemTotal .. HugePages_Free), none matching HugePages_Rsvd; repetitive read/continue trace elided ...]
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:10.814 nr_hugepages=1024
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:10.814 resv_hugepages=0
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:10.814 surplus_hugepages=0
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:10.814 anon_hugepages=0
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:10.814 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45977968 kB' 'MemAvailable: 49680940 kB' 'Buffers: 3728 kB' 'Cached: 10111372 kB' 'SwapCached: 0 kB' 'Active: 7122340 kB' 'Inactive: 3504620 kB' 'Active(anon): 6521800 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 515056 kB' 'Mapped: 201788 kB' 'Shmem: 6009940 kB' 'KReclaimable: 183304 kB' 'Slab: 544840 kB' 'SReclaimable: 183304 kB' 'SUnreclaim: 361536 kB' 'KernelStack: 13504 kB' 'PageTables: 9904 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7647428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198648 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB'
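The checks traced at setup/hugepages.sh@107-109 above tie these numbers together: with surp=0, resv=0 and anon=0 collected from the preceding get_meminfo calls, the expected 1024 pages must equal nr_hugepages + surp + resv, and nr_hugepages itself must equal 1024. A condensed re-statement of that bookkeeping, assuming the illustrative get_meminfo_value helper sketched earlier (not the script's own code):

  # Sketch of the accounting performed around setup/hugepages.sh@97-110.
  expected=1024
  anon=$(get_meminfo_value AnonHugePages)    # 0 in this run
  surp=$(get_meminfo_value HugePages_Surp)   # 0 in this run
  resv=$(get_meminfo_value HugePages_Rsvd)   # 0 in this run
  nr=$(get_meminfo_value HugePages_Total)    # queried next in the trace
  (( expected == nr + surp + resv )) && (( expected == nr )) \
      && echo "even 2G allocation accounted for: $nr x 2048 kB pages"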
setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
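For readers following the trace: the repeated "[[ <field> == HugePages_Total ]] / continue" records above and below are setup/common.sh's get_meminfo helper walking /proc/meminfo one "key: value" pair at a time and discarding every field that is not the one requested; the scan resumes below and runs until HugePages_Total matches, at which point the value 1024 is echoed back. A minimal standalone sketch of that lookup pattern, with an invented function name and simplified per-node handling (it is not the literal setup/common.sh code):

# Sketch: return the value of one /proc/meminfo field, optionally for a single
# NUMA node (the per-node file prefixes every line with "Node <n> ").
meminfo_lookup() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }   # found the requested key
    done < <(sed 's/^Node [0-9]* //' "$mem_f")               # strip the per-node prefix if any
    return 1                                                 # key not present
}

# Example: the same lookups the trace performs.
meminfo_lookup HugePages_Total
meminfo_lookup HugePages_Surp 0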
00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.815 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:10.816 04:03:58 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.816 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32876940 kB' 'MemFree: 22174096 kB' 'MemUsed: 10702844 kB' 'SwapCached: 0 kB' 'Active: 5341084 kB' 'Inactive: 3352784 kB' 'Active(anon): 4987620 kB' 'Inactive(anon): 0 kB' 'Active(file): 353464 kB' 'Inactive(file): 3352784 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8427868 kB' 'Mapped: 136660 kB' 'AnonPages: 269196 kB' 'Shmem: 4721620 kB' 'KernelStack: 7992 kB' 'PageTables: 4352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97308 kB' 'Slab: 301584 kB' 'SReclaimable: 97308 kB' 'SUnreclaim: 204276 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.817 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27664796 kB' 'MemFree: 23806004 kB' 'MemUsed: 3858792 kB' 'SwapCached: 0 kB' 'Active: 1783348 kB' 'Inactive: 151836 kB' 'Active(anon): 1536272 kB' 'Inactive(anon): 0 kB' 'Active(file): 247076 kB' 'Inactive(file): 151836 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1687256 kB' 'Mapped: 65616 kB' 'AnonPages: 247936 kB' 'Shmem: 1288344 kB' 'KernelStack: 5304 kB' 'PageTables: 5148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 85996 kB' 'Slab: 243248 kB' 'SReclaimable: 85996 kB' 'SUnreclaim: 157252 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.818 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:10.819 node0=512 expecting 512 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:10.819 node1=512 expecting 512 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:10.819 00:03:10.819 real 0m1.556s 00:03:10.819 user 0m0.627s 00:03:10.819 sys 0m0.897s 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:10.819 04:03:58 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:10.819 ************************************ 00:03:10.819 END TEST even_2G_alloc 00:03:10.819 ************************************ 00:03:10.819 04:03:58 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:10.819 04:03:58 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:10.819 04:03:58 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:10.819 04:03:58 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:10.819 ************************************ 00:03:10.819 START TEST odd_alloc 00:03:10.819 ************************************ 00:03:10.819 04:03:58 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1121 -- # odd_alloc 00:03:10.819 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:10.819 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:03:10.819 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:10.819 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:10.819 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # 
nr_hugepages=1025 00:03:10.819 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:10.820 04:03:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:12.193 0000:00:04.7 (8086 0e27): Already using the vfio-pci driver 00:03:12.193 0000:81:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:12.193 0000:00:04.6 (8086 0e26): Already using the vfio-pci driver 00:03:12.193 0000:00:04.5 (8086 0e25): Already using the vfio-pci driver 00:03:12.193 0000:00:04.4 (8086 0e24): Already using the vfio-pci driver 00:03:12.193 0000:00:04.3 (8086 0e23): Already using the vfio-pci driver 00:03:12.193 0000:00:04.2 (8086 0e22): Already using the vfio-pci driver 00:03:12.193 0000:00:04.1 (8086 0e21): Already using the vfio-pci driver 00:03:12.193 0000:00:04.0 (8086 0e20): Already using the vfio-pci driver 00:03:12.193 0000:80:04.7 (8086 0e27): Already using the vfio-pci driver 00:03:12.193 0000:80:04.6 (8086 0e26): Already using the vfio-pci driver 00:03:12.193 0000:80:04.5 (8086 0e25): Already using the vfio-pci driver 00:03:12.193 0000:80:04.4 (8086 0e24): Already using the vfio-pci driver 00:03:12.193 0000:80:04.3 (8086 0e23): Already using the vfio-pci driver 00:03:12.193 0000:80:04.2 (8086 0e22): Already using the vfio-pci driver 00:03:12.193 0000:80:04.1 (8086 0e21): Already using the vfio-pci driver 00:03:12.193 0000:80:04.0 (8086 0e20): Already using the vfio-pci driver 
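The odd_alloc preamble above requests 2098176 kB, which the script turns into nr_hugepages=1025, an odd count that cannot be split evenly across the two NUMA nodes; the trace accordingly assigns 512 pages to one node and 513 to the other before re-running setup.sh. A small sketch of that even-split-with-remainder arithmetic, simplified so the remainder lands on node 0 (illustrative names, not the literal setup/hugepages.sh loop):

# Sketch: distribute nr_hugepages as evenly as possible over the NUMA nodes,
# putting any remainder on one node so the per-node counts still sum to total.
split_hugepages_across_nodes() {
    local total=$1 nodes=$2 i per rem
    per=$(( total / nodes ))
    rem=$(( total % nodes ))
    for (( i = 0; i < nodes; i++ )); do
        echo "node$i=$(( per + (i == 0 ? rem : 0) ))"
    done
}

split_hugepages_across_nodes 1024 2   # even_2G_alloc case: node0=512 node1=512
split_hugepages_across_nodes 1025 2   # odd_alloc case:     node0=513 node1=512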
00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.193 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45991832 kB' 'MemAvailable: 49694820 kB' 'Buffers: 3728 kB' 'Cached: 10111460 kB' 'SwapCached: 0 kB' 'Active: 7112624 kB' 'Inactive: 3504620 kB' 'Active(anon): 6512084 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 505468 kB' 'Mapped: 200292 kB' 'Shmem: 6010028 kB' 'KReclaimable: 183336 kB' 'Slab: 544680 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361344 kB' 'KernelStack: 12960 kB' 'PageTables: 7784 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37609872 kB' 'Committed_AS: 7615556 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198376 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
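The get_meminfo calls traced above and below follow a simple pattern: read /proc/meminfo (or a per-node meminfo file when a node is given), split each line on ': ', skip keys that do not match the requested one, and print the matching value. A minimal, self-contained sketch of that pattern, assuming an illustrative helper name and omitting the per-node "Node N " prefix stripping that setup/common.sh performs:

get_meminfo_value() {
  # Print the value of one /proc/meminfo key, defaulting to 0 if absent.
  local want=$1 var val _
  while IFS=': ' read -r var val _; do
    [[ $var == "$want" ]] || continue   # same skip-until-match loop as the trace
    echo "${val:-0}"
    return 0
  done < /proc/meminfo
  echo 0
}

get_meminfo_value AnonHugePages     # 0 on this node per the trace
get_meminfo_value HugePages_Total   # 1025 while odd_alloc is running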
00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.194 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
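These repeated passes collect HugePages_Surp, HugePages_Rsvd and HugePages_Total so verify_nr_hugepages can check them against the requested count (the trace later echoes nr_hugepages=1025, resv_hugepages=0, surplus_hugepages=0). A hedged, self-contained sketch of that consistency check, with illustrative helper names rather than the setup/hugepages.sh code itself:

meminfo() { awk -v k="$1:" '$1 == k {print $2; exit}' /proc/meminfo; }

verify_total() {
  # Pass only if the kernel-reported total equals requested + surplus + reserved.
  local requested=$1 total surp resv
  total=$(meminfo HugePages_Total)
  surp=$(meminfo HugePages_Surp)
  resv=$(meminfo HugePages_Rsvd)
  echo "nr_hugepages=${total:-0} resv_hugepages=${resv:-0} surplus_hugepages=${surp:-0}"
  # the :-0 defaults guard against a key missing from /proc/meminfo
  (( ${total:-0} == requested + ${surp:-0} + ${resv:-0} ))
}

verify_total 1025 && echo 'hugepage totals look consistent'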
00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:12.195 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46004480 kB' 'MemAvailable: 49707468 kB' 'Buffers: 3728 kB' 'Cached: 10111460 kB' 'SwapCached: 0 kB' 'Active: 7112964 kB' 'Inactive: 3504620 kB' 'Active(anon): 6512424 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 505808 kB' 'Mapped: 200292 kB' 'Shmem: 6010028 kB' 
'KReclaimable: 183336 kB' 'Slab: 544680 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361344 kB' 'KernelStack: 12960 kB' 'PageTables: 7784 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37609872 kB' 'Committed_AS: 7615572 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198328 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.458 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 
04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.459 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 
04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@33 -- # return 0 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46004284 kB' 'MemAvailable: 49707272 kB' 'Buffers: 3728 kB' 'Cached: 10111480 kB' 'SwapCached: 0 kB' 'Active: 7112188 kB' 'Inactive: 3504620 kB' 'Active(anon): 6511648 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 504952 kB' 'Mapped: 200228 kB' 'Shmem: 6010048 kB' 'KReclaimable: 183336 kB' 'Slab: 544664 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361328 kB' 'KernelStack: 12944 kB' 'PageTables: 7708 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37609872 kB' 'Committed_AS: 7615592 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198328 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.460 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:12.461 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- 
# [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
[... get_meminfo field scan elided: the traced "continue; IFS=': '; read -r var val _" sequence repeats for AnonPages and every following /proc/meminfo field (Mapped, Shmem, KReclaimable, ..., HugePages_Total, HugePages_Free) until HugePages_Rsvd is reached ...]
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:03:12.462 nr_hugepages=1025
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:12.462 resv_hugepages=0
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:12.462 surplus_hugepages=0
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:12.462 anon_hugepages=0
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
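The field lookup traced above (and repeated below for HugePages_Total and the per-node HugePages_Surp reads) follows the get_meminfo pattern from setup/common.sh: slurp the relevant meminfo file, strip any "Node <N> " prefix, then split each line on ':' and whitespace until the requested field is found. A minimal self-contained sketch of that pattern follows; it is simplified from the traced commands, and get_meminfo_field is an illustrative name, not the helper the test itself uses.

    #!/usr/bin/env bash
    # Minimal sketch of the lookup pattern exercised in the trace above.
    # Simplified and renamed for illustration; not the test's own get_meminfo.
    get_meminfo_field() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # Per-node lookups read that node's own meminfo instead of the system-wide one.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local line var val _
        while IFS= read -r line; do
            # Per-node meminfo lines carry a "Node <N> " prefix; drop it before splitting.
            [[ -n $node ]] && line=${line#"Node $node "}
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < "$mem_f"
        return 1
    }

    get_meminfo_field HugePages_Rsvd       # 0 on this runner
    get_meminfo_field HugePages_Total      # 1025 on this runner
    get_meminfo_field HugePages_Surp 0     # per-node surplus for node0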
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:12.462 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 46004456 kB' 'MemAvailable: 49707444 kB' 'Buffers: 3728 kB' 'Cached: 10111500 kB' 'SwapCached: 0 kB' 'Active: 7111780 kB' 'Inactive: 3504620 kB' 'Active(anon): 6511240 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 504520 kB' 'Mapped: 200228 kB' 'Shmem: 6010068 kB' 'KReclaimable: 183336 kB' 'Slab: 544680 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361344 kB' 'KernelStack: 12912 kB' 'PageTables: 7640 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37609872 kB' 'Committed_AS: 7615616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198344 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB'
[... get_meminfo field scan elided: the per-field "continue; IFS=': '; read -r var val _" sequence repeats for each field of the snapshot above until HugePages_Total is reached ...]
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
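With the readbacks above in hand, the check at setup/hugepages.sh@107 and @110 is plain arithmetic: the kernel must report exactly the requested 1025 pages once surplus and reserved pages are accounted for. The sketch below reproduces that accounting check in standalone form; the awk one-liners are a shortcut for illustration, not what the traced script does.

    #!/usr/bin/env bash
    # Standalone version of the accounting check seen in the trace.
    # nr_hugepages=1025 matches the odd_alloc request in this run.
    nr_hugepages=1025
    total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
    surp=$(awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo)
    resv=$(awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo)

    if (( total == nr_hugepages + surp + resv )); then
        echo "hugepage accounting consistent: total=$total surp=$surp resv=$resv"
    else
        echo "unexpected hugepage accounting: total=$total surp=$surp resv=$resv" >&2
        exit 1
    fi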
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:12.464 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32876940 kB' 'MemFree: 22175944 kB' 'MemUsed: 10700996 kB' 'SwapCached: 0 kB' 'Active: 5333156 kB' 'Inactive: 3352784 kB' 'Active(anon): 4979692 kB' 'Inactive(anon): 0 kB' 'Active(file): 353464 kB' 'Inactive(file): 3352784 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8427924 kB' 'Mapped: 134764 kB' 'AnonPages: 261100 kB' 'Shmem: 4721676 kB' 'KernelStack: 7992 kB' 'PageTables: 4176 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97308 kB' 'Slab: 301652 kB' 'SReclaimable: 97308 kB' 'SUnreclaim: 204344 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... get_meminfo field scan elided: the per-field "continue; IFS=': '; read -r var val _" sequence repeats for each field of the node0 snapshot above until HugePages_Surp is reached ...]
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
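The per-node readback traced above leans on two kernel interfaces: each node's /sys/devices/system/node/nodeN/meminfo (whose lines carry a "Node N " prefix, hence the extglob strip at setup/common.sh@29) and the per-node nr_hugepages counters under sysfs. The rough sketch below walks that per-node pass; it assumes the default 2048 kB hugepage size reported in this run, mirrors the traced variable names, and is a simplified stand-in rather than the test's own loop.

    #!/usr/bin/env bash
    # Simplified per-node pass: allocated 2 MB hugepages plus surplus per NUMA node.
    # Assumes the 2048kB page size shown by the trace (Hugepagesize: 2048 kB).
    shopt -s extglob
    declare -a nodes_test=()
    for node_dir in /sys/devices/system/node/node+([0-9]); do
        node=${node_dir##*node}
        # Allocated pages on this node, straight from sysfs.
        nodes_test[node]=$(< "$node_dir/hugepages/hugepages-2048kB/nr_hugepages")
        # Surplus pages from the node's own meminfo; its lines look like
        # "Node 0 HugePages_Surp:     0", so strip the "Node <N> " prefix first.
        mapfile -t mem < "$node_dir/meminfo"
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == HugePages_Surp ]] && { (( nodes_test[node] += val )); break; }
        done
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"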
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:12.465 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27664796 kB' 'MemFree: 23828512 kB' 'MemUsed: 3836284 kB' 'SwapCached: 0 kB' 'Active: 1778696 kB' 'Inactive: 151836 kB' 'Active(anon): 1531620 kB' 'Inactive(anon): 0 kB' 'Active(file): 247076 kB' 'Inactive(file): 151836 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1687340 kB' 'Mapped: 65464 kB' 'AnonPages: 243400 kB' 'Shmem: 1288428 kB' 'KernelStack: 4904 kB' 'PageTables: 3424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 86028 kB' 'Slab: 243028 kB' 'SReclaimable: 86028 kB' 'SUnreclaim: 157000 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[... get_meminfo field scan elided: the per-field "continue; IFS=': '; read -r var val _" sequence repeats for each field of the node1 snapshot above until HugePages_Surp is reached ...]
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:03:12.467 node0=512 expecting 513
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:12.467 node1=513 expecting 512 00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:03:12.467 00:03:12.467 real 0m1.522s 00:03:12.467 user 0m0.635s 00:03:12.467 sys 0m0.850s 00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:12.467 04:04:00 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:12.467 ************************************ 00:03:12.467 END TEST odd_alloc 00:03:12.467 ************************************ 00:03:12.467 04:04:00 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:03:12.467 04:04:00 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:12.467 04:04:00 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:12.467 04:04:00 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:12.467 ************************************ 00:03:12.467 START TEST custom_alloc 00:03:12.467 ************************************ 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1121 -- # custom_alloc 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@83 -- # : 256 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 
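The custom_alloc trace above derives its per-node layout from plain arithmetic: 1048576 kB at the default 2048 kB hugepage size is 512 pages, split 256/256 across the two nodes, and 2097152 kB is 1024 pages, giving nodes_hp[0]=512 and nodes_hp[1]=1024 for HUGENODE. A minimal standalone sketch of that size-to-pages math (illustrative shell only, not the SPDK hugepages.sh itself; the function names are made up):

# Sketch of the size->pages math walked through above.
default_hugepages=2048                      # kB, from "Hugepagesize: 2048 kB" in the trace
size_to_pages() {                           # e.g. 1048576 -> 512, 2097152 -> 1024
    local size_kb=$1
    echo $(( size_kb / default_hugepages ))
}
split_per_node() {                          # even split: 512 pages over 2 nodes -> 256 each
    local pages=$1 nodes=$2
    local per_node=$(( pages / nodes ))
    for (( n = 0; n < nodes; n++ )); do
        echo "node$n=$per_node"
    done
}
split_per_node "$(size_to_pages 1048576)" 2
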
00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:12.467 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:12.468 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:12.468 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:12.468 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:03:12.468 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:12.468 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:12.468 04:04:00 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:03:12.468 04:04:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:12.468 04:04:00 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:13.845 0000:00:04.7 (8086 0e27): Already using the vfio-pci driver 00:03:13.845 0000:81:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:13.845 0000:00:04.6 (8086 0e26): Already using the vfio-pci driver 00:03:13.845 0000:00:04.5 (8086 0e25): Already using the vfio-pci driver 00:03:13.845 0000:00:04.4 (8086 0e24): Already using the vfio-pci driver 00:03:13.845 0000:00:04.3 (8086 0e23): Already using the vfio-pci driver 00:03:13.845 0000:00:04.2 (8086 0e22): Already using the vfio-pci driver 00:03:13.845 0000:00:04.1 (8086 0e21): Already using the vfio-pci driver 00:03:13.845 0000:00:04.0 (8086 0e20): Already using the vfio-pci driver 00:03:13.845 0000:80:04.7 (8086 0e27): Already using the vfio-pci driver 00:03:13.845 0000:80:04.6 (8086 0e26): Already using the vfio-pci driver 00:03:13.845 0000:80:04.5 (8086 0e25): Already using the vfio-pci driver 00:03:13.845 0000:80:04.4 (8086 0e24): Already using the vfio-pci driver 00:03:13.845 0000:80:04.3 (8086 0e23): Already using the vfio-pci driver 00:03:13.845 0000:80:04.2 (8086 0e22): Already using the vfio-pci driver 00:03:13.845 0000:80:04.1 (8086 0e21): Already using the vfio-pci driver 00:03:13.845 0000:80:04.0 (8086 0e20): Already using the vfio-pci driver 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 
-- # local sorted_s 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 44958896 kB' 'MemAvailable: 48661884 kB' 'Buffers: 3728 kB' 'Cached: 10111596 kB' 'SwapCached: 0 kB' 'Active: 7110720 kB' 'Inactive: 3504620 kB' 'Active(anon): 6510180 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503232 kB' 'Mapped: 200304 kB' 'Shmem: 6010164 kB' 'KReclaimable: 183336 kB' 'Slab: 544600 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361264 kB' 'KernelStack: 13024 kB' 'PageTables: 7856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37086608 kB' 'Committed_AS: 7615824 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198456 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
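The "Already using the vfio-pci driver" lines printed by scripts/setup.sh above report the driver each PCI function is currently bound to. One way to confirm that binding by hand is to read the standard sysfs driver symlink (an illustrative sketch, not necessarily how setup.sh itself performs the check):

# Print the bound driver for one PCI address, e.g. 0000:81:00.0 -> vfio-pci.
bdf=0000:81:00.0                                   # address taken from the log above
drv_link=/sys/bus/pci/devices/$bdf/driver
if [[ -e $drv_link ]]; then
    echo "$bdf is bound to $(basename "$(readlink -f "$drv_link")")"
else
    echo "$bdf has no driver bound"
fi
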
00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.845 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 
04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:13.846 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 44959332 kB' 'MemAvailable: 48662320 kB' 'Buffers: 3728 kB' 'Cached: 10111596 kB' 'SwapCached: 0 kB' 'Active: 7111352 kB' 'Inactive: 3504620 kB' 'Active(anon): 6510812 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 
3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503888 kB' 'Mapped: 200304 kB' 'Shmem: 6010164 kB' 'KReclaimable: 183336 kB' 'Slab: 544584 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361248 kB' 'KernelStack: 13008 kB' 'PageTables: 7808 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37086608 kB' 'Committed_AS: 7615840 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198424 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 
04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
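Each long run of "[[ <field> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]" / "continue" entries in this trace is the same loop walking one meminfo snapshot until it reaches the requested key (AnonHugePages above, HugePages_Surp here), echoing its value and returning. Stripped of the xtrace noise, the lookup pattern being traced is roughly the following (a reconstruction for readability, not a verbatim copy of setup/common.sh):

# Sketch of the meminfo lookup seen in the trace; names are illustrative.
get_meminfo_sketch() {                      # usage: get_meminfo_sketch <Key> [<node>]
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # Per-node snapshots live under sysfs and prefix every line with "Node N ".
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _rest
    # Drop any "Node N " prefix, then split "Key:   value kB" on ": " as the trace does.
    while IFS=': ' read -r var val _rest; do
        if [[ $var == "$get" ]]; then
            echo "$val"                     # numeric value; the kB unit, if any, lands in _rest
            return 0
        fi
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}
get_meminfo_sketch HugePages_Total          # prints 1536 on the system traced above
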
00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.847 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
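The anon=0 result above and the HugePages_Surp / HugePages_Rsvd scans around this point feed the verification of the 1536-page pool this test configured (nodes_hp[0]=512 + nodes_hp[1]=1024). A hypothetical, condensed version of that kind of consistency check (not the SPDK verify_nr_hugepages function itself) could read the counters straight from /proc/meminfo:

# Hypothetical condensed check: the pool configured for this test (1536 pages)
# should be visible system-wide with nothing surplus and nothing still reserved.
expected=1536
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
rsvd=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
if (( total == expected && surp == 0 && rsvd == 0 )); then
    echo "hugepage pool matches: $total pages"
else
    echo "mismatch: total=$total surp=$surp rsvd=$rsvd (expected $expected)" >&2
fi
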
00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.848 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 44959848 kB' 'MemAvailable: 48662836 kB' 'Buffers: 3728 kB' 'Cached: 10111616 kB' 'SwapCached: 0 kB' 'Active: 7110780 kB' 'Inactive: 3504620 kB' 'Active(anon): 6510240 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503248 kB' 'Mapped: 200240 kB' 'Shmem: 6010184 kB' 'KReclaimable: 183336 kB' 'Slab: 544584 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361248 kB' 'KernelStack: 12976 kB' 'PageTables: 7712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37086608 kB' 'Committed_AS: 7615864 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198424 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 
'DirectMap1G: 40894464 kB' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.849 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:13.850 nr_hugepages=1536 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:13.850 resv_hugepages=0 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # 
echo surplus_hugepages=0 00:03:13.850 surplus_hugepages=0 00:03:13.850 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:13.850 anon_hugepages=0 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 44960312 kB' 'MemAvailable: 48663300 kB' 'Buffers: 3728 kB' 'Cached: 10111636 kB' 'SwapCached: 0 kB' 'Active: 7110736 kB' 'Inactive: 3504620 kB' 'Active(anon): 6510196 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503180 kB' 'Mapped: 200240 kB' 'Shmem: 6010204 kB' 'KReclaimable: 183336 kB' 'Slab: 544608 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361272 kB' 'KernelStack: 12960 kB' 'PageTables: 7684 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37086608 kB' 'Committed_AS: 7615884 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198424 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 
04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:13.851 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:13.852 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:13.852 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:13.852 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.112 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 
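The xtrace above, and the node0/node1 lookups that follow, all come from the same small parser: setup/common.sh's get_meminfo walks /proc/meminfo (or a per-node meminfo file when a node index is passed), skips every field that does not match the requested key, and echoes the matching value, which setup/hugepages.sh then stores as surp, resv and the HugePages_Total count. The block below is a minimal, self-contained sketch of that helper, reconstructed from the commands visible in this trace rather than copied from the SPDK tree; the printf/while plumbing is simplified into a plain loop, and anything not shown in the trace (the explicit loop variable, the final return 1 fallback) is an assumption.

#!/usr/bin/env bash
# Sketch of the get_meminfo helper traced above (reconstructed from the
# setup/common.sh xtrace in this log; simplified, not the verbatim source).
shopt -s extglob

get_meminfo() {
	local get=$1      # e.g. HugePages_Surp, HugePages_Rsvd, HugePages_Total
	local node=$2     # optional NUMA node index
	local var val _
	local mem_f mem line

	mem_f=/proc/meminfo
	# With a node index, read /sys/devices/system/node/node<N>/meminfo instead.
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi

	mapfile -t mem < "$mem_f"
	# Per-node files prefix every line with "Node N "; strip that prefix.
	mem=("${mem[@]#Node +([0-9]) }")

	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<< "$line"
		[[ $var == "$get" ]] || continue   # the long runs of 'continue' above
		echo "$val"
		return 0
	done
	return 1   # assumed fallback; not visible in the trace
}

# Calls matching the trace: system-wide first, then per node for the
# 512 (node0) / 1024 (node1) split that custom_alloc configured.
surp=$(get_meminfo HugePages_Surp)
resv=$(get_meminfo HugePages_Rsvd)
total=$(get_meminfo HugePages_Total)
node0_surp=$(get_meminfo HugePages_Surp 0)
node1_surp=$(get_meminfo HugePages_Surp 1)

With node unset the [[ -e .../node/node/meminfo ]] test fails (as seen at common.sh@23 above), so the helper falls back to /proc/meminfo; with node=0 or node=1 it switches to the per-node file, which is why the per-node dumps below carry the smaller MemTotal values (32876940 kB for node0 and 27664796 kB for node1).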
00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32876940 kB' 'MemFree: 22166276 kB' 'MemUsed: 10710664 kB' 'SwapCached: 0 kB' 'Active: 5332124 kB' 'Inactive: 3352784 kB' 'Active(anon): 4978660 kB' 'Inactive(anon): 0 kB' 'Active(file): 353464 kB' 'Inactive(file): 3352784 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8428000 kB' 'Mapped: 134776 kB' 'AnonPages: 259996 kB' 'Shmem: 4721752 kB' 'KernelStack: 7992 kB' 'PageTables: 4048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97308 kB' 'Slab: 301660 kB' 'SReclaimable: 97308 kB' 'SUnreclaim: 204352 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.113 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:14.114 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:14.115 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:14.115 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:14.115 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:14.115 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:14.115 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:14.115 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:14.115 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27664796 kB' 'MemFree: 22794036 kB' 'MemUsed: 4870760 kB' 'SwapCached: 0 kB' 'Active: 1778596 kB' 'Inactive: 151836 kB' 'Active(anon): 1531520 kB' 'Inactive(anon): 0 kB' 'Active(file): 247076 kB' 'Inactive(file): 151836 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1687384 kB' 'Mapped: 65464 kB' 'AnonPages: 243148 kB' 'Shmem: 1288472 kB' 'KernelStack: 4952 kB' 'PageTables: 3592 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 86028 kB' 'Slab: 242940 kB' 'SReclaimable: 86028 kB' 'SUnreclaim: 156912 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:14.116 node0=512 expecting 512
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:03:14.116 node1=1024 expecting 1024
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:03:14.116
00:03:14.116 real 0m1.526s
00:03:14.116 user 0m0.680s
00:03:14.116 sys 0m0.812s
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable
00:03:14.116 04:04:01 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:03:14.116 ************************************
00:03:14.116 END TEST custom_alloc
00:03:14.116 ************************************
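The field-by-field scan above is setup/common.sh's get_meminfo walking a meminfo file one line at a time with IFS=': ' until it reaches the field it was asked for (here HugePages_Surp), reading /proc/meminfo for the machine-wide case and /sys/devices/system/node/nodeN/meminfo when a node is given; hugepages.sh then folds the surplus and reserved counts into nodes_test[] per node before comparing against the expected node0=512 / node1=1024 split. A minimal stand-alone sketch of that lookup, mirroring only what the trace shows (the function name, argument handling, and error return below are illustrative, not the actual setup/common.sh code):

  # get_meminfo_sketch FIELD [NODE] - hypothetical helper that echoes one meminfo
  # value, machine-wide or for a single NUMA node, the way the trace above does.
  get_meminfo_sketch() {
      local get=$1 node=$2 line var val _
      local mem_f=/proc/meminfo
      [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
          mem_f=/sys/devices/system/node/node$node/meminfo
      while read -r line; do
          line=${line#Node "$node" }             # per-node files prefix every line with "Node N "
          IFS=': ' read -r var val _ <<< "$line"
          if [[ $var == "$get" ]]; then
              echo "$val"                        # e.g. 1024 for HugePages_Total, 0 for HugePages_Surp
              return 0
          fi
      done < "$mem_f"
      return 1
  }
  # e.g. get_meminfo_sketch HugePages_Surp 1    # prints 0 on the host traced above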
00:03:14.116 04:04:01 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:03:14.116 04:04:01 setup.sh.hugepages -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']'
00:03:14.116 04:04:01 setup.sh.hugepages -- common/autotest_common.sh@1103 -- # xtrace_disable
00:03:14.116 04:04:01 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:03:14.116 ************************************
00:03:14.116 START TEST no_shrink_alloc
00:03:14.116 ************************************
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1121 -- # no_shrink_alloc
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0
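The get_test_nr_hugepages trace above reduces to one line of arithmetic: the requested size of 2097152 (interpreted in kB, which the resulting page count bears out) divided by the 2048 kB default hugepage size is 1024 pages, and because the caller passed node id 0 the whole count lands in nodes_test[0]. A small sketch of that split, assuming the kB-based division implied by the values in the trace (variable names are mine, and awk stands in for the script's own get_meminfo):

  # Reproduce the numbers from the trace: 2097152 kB / 2048 kB per page = 1024 pages on node 0.
  size_kb=2097152                                                  # argument passed to get_test_nr_hugepages
  hugepage_kb=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)   # 2048 on this host
  nr_hugepages=$(( size_kb / hugepage_kb ))                        # 1024
  user_nodes=(0)                                                   # node_ids=('0') in the trace
  declare -a nodes_test
  for node in "${user_nodes[@]}"; do
      nodes_test[node]=$nr_hugepages                               # all 1024 pages requested on node 0
  done
  echo "nr_hugepages=$nr_hugepages nodes_test[0]=${nodes_test[0]}"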
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:03:14.116 04:04:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:03:15.492 0000:00:04.7 (8086 0e27): Already using the vfio-pci driver
00:03:15.492 0000:81:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:15.492 0000:00:04.6 (8086 0e26): Already using the vfio-pci driver
00:03:15.492 0000:00:04.5 (8086 0e25): Already using the vfio-pci driver
00:03:15.492 0000:00:04.4 (8086 0e24): Already using the vfio-pci driver
00:03:15.492 0000:00:04.3 (8086 0e23): Already using the vfio-pci driver
00:03:15.492 0000:00:04.2 (8086 0e22): Already using the vfio-pci driver
00:03:15.492 0000:00:04.1 (8086 0e21): Already using the vfio-pci driver
00:03:15.492 0000:00:04.0 (8086 0e20): Already using the vfio-pci driver
00:03:15.492 0000:80:04.7 (8086 0e27): Already using the vfio-pci driver
00:03:15.492 0000:80:04.6 (8086 0e26): Already using the vfio-pci driver
00:03:15.492 0000:80:04.5 (8086 0e25): Already using the vfio-pci driver
00:03:15.492 0000:80:04.4 (8086 0e24): Already using the vfio-pci driver
00:03:15.492 0000:80:04.3 (8086 0e23): Already using the vfio-pci driver
00:03:15.492 0000:80:04.2 (8086 0e22): Already using the vfio-pci driver
00:03:15.492 0000:80:04.1 (8086 0e21): Already using the vfio-pci driver
00:03:15.492 0000:80:04.0 (8086 0e20): Already using the vfio-pci driver
00:03:15.492 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:03:15.492 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:03:15.492 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:03:15.492 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:03:15.492 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:03:15.492 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:03:15.492 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45972288 kB' 'MemAvailable: 49675276 kB' 'Buffers: 3728 kB' 'Cached: 10111720 kB' 'SwapCached: 0 kB' 'Active: 7116588 kB' 'Inactive: 3504620 kB' 'Active(anon): 6516048 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509064 kB' 'Mapped: 201188 kB' 'Shmem: 6010288 kB' 'KReclaimable: 183336 kB' 'Slab: 544532 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361196 kB' 'KernelStack: 12992 kB' 'PageTables: 7820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7622192 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198428 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB'
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:15.493 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
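verify_nr_hugepages starts by checking /sys/kernel/mm/transparent_hugepage/enabled (the bracketed entry in "always [madvise] never" is the active mode) and, as long as THP is not pinned to [never], samples AnonHugePages, presumably so THP-backed anonymous memory can be told apart from the explicitly reserved pool; on this host the sample is 0 kB, hence anon=0 above. A simplified sketch of those two probes, with awk standing in for the script's own get_meminfo helper:

  # Check the active THP mode and, unless it is [never], record the AnonHugePages baseline.
  thp_mode=$(cat /sys/kernel/mm/transparent_hugepage/enabled)    # e.g. "always [madvise] never"
  anon=0
  if [[ $thp_mode != *"[never]"* ]]; then
      anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)   # kB; 0 in this run
  fi
  echo "thp_mode=$thp_mode anon=$anon"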
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45973628 kB' 'MemAvailable: 49676616 kB' 'Buffers: 3728 kB' 'Cached: 10111724 kB' 'SwapCached: 0 kB' 'Active: 7111640 kB' 'Inactive: 3504620 kB' 'Active(anon): 6511100 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 504180 kB' 'Mapped: 200828 kB' 'Shmem: 6010292 kB' 'KReclaimable: 183336 kB' 'Slab: 544540 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361204 kB' 'KernelStack: 13056 kB' 'PageTables: 8020 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7616092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198392 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB'
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:15.494 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32
-- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:15.496 04:04:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45973656 kB' 'MemAvailable: 49676644 kB' 'Buffers: 3728 kB' 'Cached: 10111740 kB' 'SwapCached: 0 kB' 'Active: 7111364 kB' 'Inactive: 3504620 kB' 'Active(anon): 6510824 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503884 kB' 'Mapped: 200348 kB' 'Shmem: 6010308 kB' 'KReclaimable: 183336 kB' 'Slab: 544516 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361180 kB' 'KernelStack: 13040 kB' 'PageTables: 7840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7616112 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198376 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 
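Editor's note for readers following the trace: the long runs of "continue" above come from setup/common.sh's get_meminfo helper, which reads /proc/meminfo (or a per-node meminfo file), strips any "Node N " prefix, and scans line by line until it finds the requested key (here HugePages_Surp, then HugePages_Rsvd). The sketch below is reconstructed from the commands visible in the xtrace output; it is not the verbatim SPDK script.

#!/usr/bin/env bash
shopt -s extglob   # needed for the "Node +([0-9]) " prefix strip seen in the trace

# Reconstructed sketch of the traced get_meminfo helper: return the value of a
# single meminfo field, optionally scoped to one NUMA node.
get_meminfo() {
        local get=$1        # field name, e.g. HugePages_Surp
        local node=${2:-}   # optional NUMA node id
        local var val _
        local mem_f=/proc/meminfo mem

        # With a node id, read the per-node meminfo instead of the global one.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
                mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        mapfile -t mem < "$mem_f"
        # Per-node meminfo lines start with "Node N "; strip that prefix so
        # every remaining line is plain "Key: value [kB]".
        mem=("${mem[@]#Node +([0-9]) }")

        # Scan line by line; every non-matching key shows up as a 'continue'
        # in the xtrace output, the matching key prints its value and returns.
        local line
        for line in "${mem[@]}"; do
                IFS=': ' read -r var val _ <<< "$line"
                if [[ $var == "$get" ]]; then
                        echo "$val"
                        return 0
                fi
        done
        return 1
}

# Matches what the trace records at setup/hugepages.sh@99-100 on this machine:
surp=$(get_meminfo HugePages_Surp)   # 0
resv=$(get_meminfo HugePages_Rsvd)   # 0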
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.496 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:15.497 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:15.498 nr_hugepages=1024 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:15.498 resv_hugepages=0 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:15.498 surplus_hugepages=0 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:15.498 
anon_hugepages=0 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45973124 kB' 'MemAvailable: 49676112 kB' 'Buffers: 3728 kB' 'Cached: 10111764 kB' 'SwapCached: 0 kB' 'Active: 7111124 kB' 'Inactive: 3504620 kB' 'Active(anon): 6510584 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503480 kB' 'Mapped: 200268 kB' 'Shmem: 6010332 kB' 'KReclaimable: 183336 kB' 'Slab: 544484 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361148 kB' 'KernelStack: 13008 kB' 'PageTables: 7732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7616136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198376 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 
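The hugepages.sh lines above then check that the kernel's accounting matches what the test requested: the meminfo snapshot reports HugePages_Total: 1024 and Hugepagesize: 2048 kB (hence Hugetlb: 2097152 kB = 1024 x 2048 kB), so the assertion (( 1024 == nr_hugepages + surp + resv )) holds with surp=0 and resv=0. A minimal sketch of that bookkeeping follows; the variable names come from the trace, the surrounding wiring is assumed, and it reuses the get_meminfo sketch above.

# Assumes the get_meminfo sketch above has been sourced.
nr_hugepages=1024                       # pool size requested by this test

surp=$(get_meminfo HugePages_Surp)      # surplus pages, 0 in the log
resv=$(get_meminfo HugePages_Rsvd)      # reserved pages, 0 in the log
anon=$(get_meminfo AnonHugePages)       # anonymous hugepage usage, 0 kB in the log

echo "nr_hugepages=$nr_hugepages"
echo "resv_hugepages=$resv"
echo "surplus_hugepages=$surp"
echo "anon_hugepages=$anon"

# The pool is only considered healthy when the kernel's total equals the
# requested count plus surplus and reserved pages (1024 == 1024 + 0 + 0 here).
total=$(get_meminfo HugePages_Total)
if (( total != nr_hugepages + surp + resv )); then
        echo "hugepage accounting mismatch: total=$total" >&2
        exit 1
fi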
04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.498 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 
04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.499 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.757 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:15.758 04:04:03 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32876940 kB' 'MemFree: 21122000 kB' 'MemUsed: 11754940 kB' 'SwapCached: 0 kB' 'Active: 5332884 kB' 'Inactive: 3352784 kB' 'Active(anon): 4979420 kB' 'Inactive(anon): 0 kB' 'Active(file): 353464 kB' 'Inactive(file): 3352784 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8428128 kB' 'Mapped: 134804 kB' 'AnonPages: 260764 kB' 'Shmem: 4721880 kB' 'KernelStack: 8120 kB' 'PageTables: 4360 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97308 kB' 'Slab: 301612 kB' 'SReclaimable: 97308 kB' 'SUnreclaim: 204304 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.758 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 
04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- 
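
The lookup that completes just above (get_meminfo HugePages_Surp 0, which echoes 0 and returns) is setup/common.sh's meminfo parser, and its control flow can be read directly off the trace: prefer /sys/devices/system/node/node0/meminfo over /proc/meminfo when the per-node file exists, strip the leading "Node N " prefix from every line, then split each "key: value" pair on IFS=': ' until the requested field matches. Below is a condensed sketch of that parser, reconstructed from the traced statements; the loop structure and the not-found return value are assumptions, since only the individual commands appear in the log.

    shopt -s extglob                                 # needed for the "Node +([0-9]) " prefix strip
    get_meminfo() {                                  # usage: get_meminfo <field> [numa-node]
        local get=$1 node=${2:-}
        local var val _ line
        local mem_f=/proc/meminfo
        local -a mem
        # common.sh@23-24: use the per-node view when a node id was given and the file exists
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"                    # common.sh@28
        mem=("${mem[@]#Node +([0-9]) }")             # common.sh@29: drop the "Node N " prefix
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"   # common.sh@31
            [[ $var == "$get" ]] && { echo "$val"; return 0; }   # common.sh@32-33
        done
        return 1                                     # field not found (assumed)
    }
    # The call traced above: get_meminfo HugePages_Surp 0   -> prints 0
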
setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:15.759 node0=1024 expecting 1024 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:15.759 04:04:03 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:17.135 0000:00:04.7 (8086 0e27): Already using the vfio-pci driver 00:03:17.135 0000:81:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:17.135 0000:00:04.6 (8086 0e26): Already using the vfio-pci driver 00:03:17.135 0000:00:04.5 (8086 0e25): Already using the vfio-pci driver 00:03:17.135 0000:00:04.4 (8086 0e24): Already using the vfio-pci driver 00:03:17.135 0000:00:04.3 (8086 0e23): Already using the vfio-pci driver 00:03:17.135 0000:00:04.2 (8086 0e22): Already using the vfio-pci driver 00:03:17.135 0000:00:04.1 (8086 0e21): Already using the vfio-pci driver 00:03:17.135 0000:00:04.0 (8086 0e20): Already using the vfio-pci driver 00:03:17.135 0000:80:04.7 (8086 0e27): Already using the vfio-pci driver 00:03:17.135 0000:80:04.6 (8086 0e26): Already using the vfio-pci driver 00:03:17.135 0000:80:04.5 (8086 0e25): Already using the vfio-pci driver 00:03:17.135 0000:80:04.4 (8086 0e24): Already using the vfio-pci driver 00:03:17.135 0000:80:04.3 (8086 0e23): Already using the vfio-pci driver 00:03:17.135 0000:80:04.2 (8086 0e22): Already using the vfio-pci driver 00:03:17.135 0000:80:04.1 (8086 0e21): Already using the vfio-pci driver 00:03:17.135 0000:80:04.0 (8086 0e20): Already using the vfio-pci driver 00:03:17.135 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:17.135 04:04:04 
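
Just above, the per-node totals are folded into the sorted_t/sorted_s buckets, the script prints "node0=1024 expecting 1024", and the @130 comparison passes; it then sets CLEAR_HUGE=no and NRHUGE=512 and re-runs scripts/setup.sh, which only reports "INFO: Requested 512 hugepages but 1024 already allocated on node0" because the existing pool already exceeds the new request, after which verify_nr_hugepages begins a second pass. The sketch below reconstructs the bucket-and-compare step from the trace; the array names are the traced ones, but which operand of the echo is "observed" versus "expecting" cannot be read off the log, so that ordering is an assumption.

    declare -a nodes_test=([0]=1024) nodes_sys=([0]=1024) sorted_t=() sorted_s=()
    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1     # hugepages.sh@127: bucket the expected per-node count
        sorted_s[nodes_sys[node]]=1      # hugepages.sh@127: bucket the observed per-node count
        echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    done
    # hugepages.sh@130: the check passes when both buckets collapse to the same single key
    [[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo "per-node hugepage counts match"   # 1024 == 1024 here
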
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.135 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45960488 kB' 'MemAvailable: 49663476 kB' 'Buffers: 3728 kB' 'Cached: 10111856 kB' 'SwapCached: 0 kB' 'Active: 7111100 kB' 'Inactive: 3504620 kB' 'Active(anon): 6510560 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503840 kB' 'Mapped: 200380 kB' 'Shmem: 6010424 kB' 'KReclaimable: 183336 kB' 'Slab: 544588 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361252 kB' 'KernelStack: 13008 kB' 'PageTables: 7724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7616336 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198568 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 
04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.136 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.136 
04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:04 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45961112 kB' 'MemAvailable: 49664100 kB' 'Buffers: 3728 kB' 'Cached: 10111856 kB' 'SwapCached: 0 kB' 'Active: 7111476 kB' 'Inactive: 3504620 kB' 'Active(anon): 6510936 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503836 kB' 'Mapped: 200396 kB' 'Shmem: 6010424 kB' 'KReclaimable: 183336 kB' 'Slab: 544692 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361356 kB' 'KernelStack: 12976 kB' 'PageTables: 7644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7616356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198552 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 
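
Note that this second verification pass calls get_meminfo without a node argument: the trace shows "local node=" and an existence test against /sys/devices/system/node/node/meminfo (empty node id), which fails, so mem_f stays at the system-wide /proc/meminfo whose full contents appear in the printf dump above. Using the get_meminfo sketch shown earlier, the two lookups of this pass reduce to the assignments below; the values in the comments are the ones echoed by the trace.

    anon=$(get_meminfo AnonHugePages)    # no node argument -> parses /proc/meminfo -> 0
    surp=$(get_meminfo HugePages_Surp)   # same fallback path -> 0
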
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.137 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.138 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45961508 kB' 'MemAvailable: 49664496 kB' 'Buffers: 3728 kB' 'Cached: 10111864 kB' 'SwapCached: 0 kB' 'Active: 7111588 kB' 'Inactive: 3504620 kB' 'Active(anon): 6511048 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503920 kB' 'Mapped: 200332 kB' 'Shmem: 6010432 kB' 'KReclaimable: 183336 kB' 'Slab: 544692 kB' 'SReclaimable: 183336 kB' 
'SUnreclaim: 361356 kB' 'KernelStack: 12992 kB' 'PageTables: 7640 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7616376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198552 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.139 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.140 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
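For reference, the quoted printf a few entries up is the raw /proc/meminfo snapshot this pass is scanning, and it already carries the numbers the test cares about: HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Rsvd: 0, HugePages_Surp: 0, Hugepagesize: 2048 kB and Hugetlb: 2097152 kB. A quick hand check of that snapshot (a throwaway snippet, not part of the SPDK scripts): the hugetlb pool size must equal the page count times the page size.

    # Values copied from the snapshot above.
    hugepages_total=1024 hugepagesize_kb=2048 hugetlb_kb=2097152
    (( hugetlb_kb == hugepages_total * hugepagesize_kb )) && echo "hugetlb pool size is consistent"   # 1024 * 2048 kB = 2097152 kB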
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:17.141 nr_hugepages=1024 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:17.141 resv_hugepages=0 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:17.141 surplus_hugepages=0 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:17.141 anon_hugepages=0 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60541736 kB' 'MemFree: 45961056 kB' 'MemAvailable: 49664044 kB' 'Buffers: 3728 kB' 'Cached: 10111880 kB' 'SwapCached: 0 kB' 'Active: 7111180 kB' 'Inactive: 3504620 kB' 'Active(anon): 6510640 kB' 'Inactive(anon): 0 kB' 'Active(file): 600540 kB' 'Inactive(file): 3504620 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 503392 kB' 'Mapped: 200256 kB' 'Shmem: 6010448 kB' 'KReclaimable: 183336 
kB' 'Slab: 544676 kB' 'SReclaimable: 183336 kB' 'SUnreclaim: 361340 kB' 'KernelStack: 13072 kB' 'PageTables: 7548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37610896 kB' 'Committed_AS: 7616400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 198552 kB' 'VmallocChunk: 0 kB' 'Percpu: 32256 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2256476 kB' 'DirectMap2M: 25974784 kB' 'DirectMap1G: 40894464 kB' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.141 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
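This pass is the get_meminfo HugePages_Total call made at setup/hugepages.sh@110; just before it, the assertions traced at @107 and @109 checked that the pages the test configured are fully accounted for once surp and resv are known. Redone by hand with the values this run logged (nr_hugepages=1024, surp=0, resv=0; the left-hand 1024 appears already expanded by xtrace, so the exact expression used in hugepages.sh is not visible here):

    nr_hugepages=1024 surp=0 resv=0
    (( 1024 == nr_hugepages + surp + resv ))   # as traced at setup/hugepages.sh@107
    (( 1024 == nr_hugepages ))                 # as traced at setup/hugepages.sh@109
    echo "no_shrink_alloc accounting holds"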
00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.142 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:17.143 04:04:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32876940 kB' 'MemFree: 21120056 kB' 'MemUsed: 11756884 kB' 'SwapCached: 0 kB' 'Active: 5333320 kB' 'Inactive: 3352784 kB' 'Active(anon): 4979856 kB' 'Inactive(anon): 0 kB' 'Active(file): 353464 kB' 'Inactive(file): 3352784 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8428224 kB' 'Mapped: 134792 kB' 'AnonPages: 261084 kB' 'Shmem: 4721976 kB' 'KernelStack: 8120 kB' 'PageTables: 4316 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 97308 kB' 'Slab: 301756 kB' 'SReclaimable: 97308 kB' 'SUnreclaim: 204448 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 
kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.143 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:17.144 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:17.144 node0=1024 expecting 1024 00:03:17.145 04:04:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:17.145 00:03:17.145 real 0m3.141s 00:03:17.145 user 0m1.240s 00:03:17.145 sys 0m1.837s 00:03:17.145 04:04:05 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:17.145 04:04:05 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:17.145 ************************************ 00:03:17.145 END TEST no_shrink_alloc 00:03:17.145 ************************************ 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:17.145 04:04:05 
setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:17.145 04:04:05 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:17.145 00:03:17.145 real 0m13.106s 00:03:17.145 user 0m4.714s 00:03:17.145 sys 0m6.380s 00:03:17.145 04:04:05 setup.sh.hugepages -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:17.145 04:04:05 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:17.145 ************************************ 00:03:17.145 END TEST hugepages 00:03:17.145 ************************************ 00:03:17.145 04:04:05 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:03:17.402 04:04:05 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:17.402 04:04:05 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:17.402 04:04:05 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:17.402 ************************************ 00:03:17.402 START TEST driver 00:03:17.402 ************************************ 00:03:17.402 04:04:05 setup.sh.driver -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:03:17.402 * Looking for test storage... 00:03:17.402 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:17.402 04:04:05 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:03:17.402 04:04:05 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:17.402 04:04:05 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:20.007 04:04:07 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:03:20.007 04:04:07 setup.sh.driver -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:20.007 04:04:07 setup.sh.driver -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:20.007 04:04:07 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:03:20.007 ************************************ 00:03:20.007 START TEST guess_driver 00:03:20.007 ************************************ 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- common/autotest_common.sh@1121 -- # guess_driver 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- 
setup/driver.sh@29 -- # (( 189 > 0 )) 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:03:20.007 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:20.007 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:20.007 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:20.007 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:20.007 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:03:20.007 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:03:20.007 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:03:20.007 Looking for driver=vfio-pci 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:03:20.007 04:04:07 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 
setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:21.380 04:04:09 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:21.380 04:04:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:23.278 04:04:11 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:23.278 04:04:11 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:23.278 04:04:11 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:23.278 04:04:11 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:03:23.278 04:04:11 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:03:23.278 04:04:11 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:23.278 04:04:11 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:25.808 00:03:25.808 real 0m5.809s 00:03:25.808 user 0m1.180s 00:03:25.808 sys 0m1.873s 00:03:25.808 04:04:13 setup.sh.driver.guess_driver -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:25.808 04:04:13 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:03:25.808 ************************************ 00:03:25.808 END TEST guess_driver 00:03:25.808 ************************************ 00:03:25.808 00:03:25.808 real 0m8.472s 00:03:25.808 user 0m1.846s 00:03:25.808 sys 0m3.013s 00:03:25.808 04:04:13 setup.sh.driver -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:25.808 04:04:13 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:03:25.808 ************************************ 00:03:25.808 END TEST driver 00:03:25.808 ************************************ 00:03:25.808 04:04:13 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:03:25.808 04:04:13 setup.sh -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:25.808 04:04:13 setup.sh -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:25.808 04:04:13 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:25.808 ************************************ 00:03:25.808 START TEST devices 00:03:25.808 ************************************ 00:03:25.808 04:04:13 setup.sh.devices -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:03:25.808 * Looking for test storage... 
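For readers following the trace: the guess_driver test that just finished settles on vfio-pci because the host exposes IOMMU groups (189 of them) and `modprobe --show-depends vfio_pci` resolves to real kernel modules. The sketch below condenses that selection logic; it is a minimal illustration of what the trace shows, not a copy of test/setup/driver.sh, and the final fallback string is taken from the check at driver.sh@51 above.

```bash
#!/usr/bin/env bash
# Minimal sketch of the driver pick exercised by guess_driver above:
# prefer vfio-pci when the IOMMU is usable (or unsafe no-IOMMU mode is
# enabled) and modprobe can resolve the module to actual .ko files.
pick_driver() {
    local unsafe_vfio=N
    [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] &&
        unsafe_vfio=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)

    local iommu_groups=(/sys/kernel/iommu_groups/*)
    if [[ $unsafe_vfio == [yY] ]] || (( ${#iommu_groups[@]} > 0 )); then
        # vfio_pci is usable only if modprobe resolves it to real modules.
        if modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
            echo vfio-pci
            return 0
        fi
    fi
    echo 'No valid driver found'   # failure string checked at driver.sh@51
}

driver=$(pick_driver)
[[ $driver != 'No valid driver found' ]] && echo "Looking for driver=$driver"
# On GP12 this prints "Looking for driver=vfio-pci": 189 IOMMU groups exist.
```

Counting entries under /sys/kernel/iommu_groups is a cheap proxy for "the IOMMU is enabled", which is why the unsafe no-IOMMU knob is consulted first and the module resolution check comes second.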
00:03:25.808 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:25.808 04:04:13 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:03:25.808 04:04:13 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:03:25.808 04:04:13 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:25.808 04:04:13 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@1666 -- # local nvme bdf 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:81:00.0 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\8\1\:\0\0\.\0* ]] 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:03:27.707 04:04:15 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:03:27.707 04:04:15 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:03:27.707 No valid GPT data, bailing 00:03:27.707 04:04:15 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:27.707 04:04:15 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:03:27.707 04:04:15 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:03:27.707 04:04:15 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:03:27.707 04:04:15 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:03:27.707 04:04:15 setup.sh.devices -- setup/common.sh@80 -- # echo 2000398934016 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@204 -- # (( 2000398934016 >= min_disk_size )) 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@206 -- # 
blocks_to_pci["${block##*/}"]=0000:81:00.0 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:03:27.707 04:04:15 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:27.707 04:04:15 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:03:27.707 ************************************ 00:03:27.707 START TEST nvme_mount 00:03:27.707 ************************************ 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1121 -- # nvme_mount 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:27.707 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:27.708 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:03:27.708 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:27.708 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:03:27.708 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:03:27.708 04:04:15 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:03:28.640 Creating new GPT entries in memory. 00:03:28.640 GPT data structures destroyed! You may now partition the disk using fdisk or 00:03:28.640 other utilities. 00:03:28.640 04:04:16 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:03:28.640 04:04:16 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:28.640 04:04:16 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 
2048 : part_end + 1 )) 00:03:28.640 04:04:16 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:03:28.640 04:04:16 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:03:29.573 Creating new GPT entries in memory. 00:03:29.573 The operation has completed successfully. 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 3777424 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:81:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:81:00.0 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:81:00.0 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:03:29.573 04:04:17 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:81:00.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 
Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:03:30.946 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:03:30.946 04:04:18 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:03:31.204 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:03:31.204 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:03:31.204 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:03:31.204 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:81:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:81:00.0 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:81:00.0 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:03:31.204 04:04:19 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:81:00.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:81:00.0 data@nvme0n1 '' '' 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:81:00.0 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:03:32.578 04:04:20 
setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:81:00.0 00:03:32.578 04:04:20 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:03:32.579 04:04:20 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:03:32.579 04:04:20 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:81:00.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == 
\0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:33.951 04:04:21 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:34.209 04:04:22 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:34.209 04:04:22 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:03:34.209 04:04:22 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:03:34.209 04:04:22 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:03:34.209 04:04:22 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:34.209 04:04:22 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:34.209 04:04:22 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:03:34.209 04:04:22 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:03:34.209 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:03:34.209 00:03:34.209 real 0m6.663s 00:03:34.209 user 0m1.650s 00:03:34.209 sys 0m2.625s 00:03:34.209 04:04:22 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:34.209 04:04:22 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:03:34.209 ************************************ 00:03:34.209 END TEST nvme_mount 00:03:34.209 ************************************ 00:03:34.209 04:04:22 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:03:34.209 04:04:22 setup.sh.devices -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:34.209 04:04:22 setup.sh.devices -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:34.209 04:04:22 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:03:34.209 ************************************ 00:03:34.209 START TEST dm_mount 00:03:34.209 ************************************ 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- common/autotest_common.sh@1121 -- # dm_mount 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:03:34.209 04:04:22 setup.sh.devices.dm_mount 
-- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:03:34.209 04:04:22 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:03:35.581 Creating new GPT entries in memory. 00:03:35.581 GPT data structures destroyed! You may now partition the disk using fdisk or 00:03:35.582 other utilities. 00:03:35.582 04:04:23 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:03:35.582 04:04:23 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:35.582 04:04:23 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:03:35.582 04:04:23 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:03:35.582 04:04:23 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:03:36.515 Creating new GPT entries in memory. 00:03:36.515 The operation has completed successfully. 00:03:36.515 04:04:24 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:03:36.515 04:04:24 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:36.515 04:04:24 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:03:36.515 04:04:24 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:03:36.515 04:04:24 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:03:37.449 The operation has completed successfully. 
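The dm_mount setup above carves two 1 GiB partitions out of nvme0n1: common.sh converts the 1073741824-byte target size into 2097152 sectors of 512 B, zaps the existing label, then issues one sgdisk --new call per partition (sectors 2048-2099199 and 2099200-4196351) while sync_dev_uevents.sh waits for the matching udev events. A minimal, destructive sketch of the same layout, with the device name taken from the trace and partprobe standing in for the uevent sync helper:

    # Recreate the two 1 GiB partitions used by the dm_mount test (destructive sketch).
    disk=/dev/nvme0n1
    size=$((1073741824 / 512))            # 2097152 sectors of 512 B = 1 GiB per partition
    sgdisk "$disk" --zap-all              # wipe the old GPT/MBR, as in common.sh@56
    start=2048
    for part in 1 2; do
        end=$((start + size - 1))         # 2099199, then 4196351, matching the trace
        flock "$disk" sgdisk "$disk" --new=${part}:${start}:${end}
        start=$((end + 1))
    done
    partprobe "$disk"                     # stand-in for scripts/sync_dev_uevents.sh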
00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 3780109 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:03:37.449 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:81:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:81:00.0 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 
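Once both partitions exist, the test builds a device-mapper node named nvme_dm_test, waits for /dev/mapper/nvme_dm_test to appear, resolves it to dm-0, confirms dm-0 shows up under holders/ of both partitions, then formats and mounts it. The xtrace never prints the table handed to dmsetup, so the table in the sketch below is an assumption: a plain linear concatenation of the two partitions is one table that produces exactly those holders/ entries. The mount point is illustrative:

    # Sketch: assemble a dm device over both partitions, format it, mount it.
    dm_name=nvme_dm_test
    mnt=/tmp/dm_mount                                     # illustrative mount point
    s1=$(blockdev --getsz /dev/nvme0n1p1)                 # partition sizes in 512 B sectors
    s2=$(blockdev --getsz /dev/nvme0n1p2)
    printf '0 %s linear /dev/nvme0n1p1 0\n%s %s linear /dev/nvme0n1p2 0\n' \
        "$s1" "$s1" "$s2" | dmsetup create "$dm_name"     # assumed table, not shown in the log
    readlink -f "/dev/mapper/$dm_name"                    # resolves to /dev/dm-0 as in the trace
    ls /sys/class/block/nvme0n1p1/holders                 # dm-0 listed for both partitions
    mkfs.ext4 -qF "/dev/mapper/$dm_name"
    mkdir -p "$mnt" && mount "/dev/mapper/$dm_name" "$mnt"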
00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:81:00.0 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:03:37.450 04:04:25 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:81:00.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == 
\0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:81:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:81:00.0 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:81:00.0 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == 
output ]] 00:03:38.825 04:04:26 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:81:00.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 
04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\8\1\:\0\0\.\0 ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:03:40.200 04:04:27 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:03:40.200 04:04:28 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:40.200 04:04:28 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:03:40.200 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:03:40.200 04:04:28 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:03:40.200 04:04:28 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:03:40.200 00:03:40.200 real 0m5.891s 00:03:40.200 user 0m1.105s 00:03:40.200 sys 0m1.699s 00:03:40.200 04:04:28 setup.sh.devices.dm_mount -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:40.200 04:04:28 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:03:40.200 ************************************ 00:03:40.200 END TEST dm_mount 00:03:40.200 ************************************ 00:03:40.200 04:04:28 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:03:40.200 04:04:28 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:03:40.200 04:04:28 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:03:40.200 04:04:28 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:40.200 04:04:28 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:03:40.200 04:04:28 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:03:40.200 04:04:28 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:03:40.459 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:03:40.459 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:03:40.459 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:03:40.459 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:03:40.459 04:04:28 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:03:40.459 04:04:28 setup.sh.devices -- 
setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:03:40.459 04:04:28 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:03:40.459 04:04:28 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:40.459 04:04:28 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:03:40.459 04:04:28 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:03:40.459 04:04:28 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:03:40.459 00:03:40.459 real 0m14.639s 00:03:40.459 user 0m3.511s 00:03:40.459 sys 0m5.421s 00:03:40.459 04:04:28 setup.sh.devices -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:40.459 04:04:28 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:03:40.459 ************************************ 00:03:40.459 END TEST devices 00:03:40.459 ************************************ 00:03:40.459 00:03:40.459 real 0m48.476s 00:03:40.459 user 0m13.879s 00:03:40.459 sys 0m20.631s 00:03:40.459 04:04:28 setup.sh -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:40.459 04:04:28 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:40.459 ************************************ 00:03:40.459 END TEST setup.sh 00:03:40.459 ************************************ 00:03:40.459 04:04:28 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:41.834 Hugepages 00:03:41.834 node hugesize free / total 00:03:41.834 node0 1048576kB 0 / 0 00:03:41.834 node0 2048kB 1024 / 1024 00:03:41.835 node1 1048576kB 0 / 0 00:03:41.835 node1 2048kB 1024 / 1024 00:03:41.835 00:03:41.835 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:41.835 I/OAT 0000:00:04.0 8086 0e20 0 ioatdma - - 00:03:41.835 I/OAT 0000:00:04.1 8086 0e21 0 ioatdma - - 00:03:41.835 I/OAT 0000:00:04.2 8086 0e22 0 ioatdma - - 00:03:41.835 I/OAT 0000:00:04.3 8086 0e23 0 ioatdma - - 00:03:41.835 I/OAT 0000:00:04.4 8086 0e24 0 ioatdma - - 00:03:41.835 I/OAT 0000:00:04.5 8086 0e25 0 ioatdma - - 00:03:41.835 I/OAT 0000:00:04.6 8086 0e26 0 ioatdma - - 00:03:41.835 I/OAT 0000:00:04.7 8086 0e27 0 ioatdma - - 00:03:41.835 I/OAT 0000:80:04.0 8086 0e20 1 ioatdma - - 00:03:41.835 I/OAT 0000:80:04.1 8086 0e21 1 ioatdma - - 00:03:41.835 I/OAT 0000:80:04.2 8086 0e22 1 ioatdma - - 00:03:41.835 I/OAT 0000:80:04.3 8086 0e23 1 ioatdma - - 00:03:41.835 I/OAT 0000:80:04.4 8086 0e24 1 ioatdma - - 00:03:41.835 I/OAT 0000:80:04.5 8086 0e25 1 ioatdma - - 00:03:41.835 I/OAT 0000:80:04.6 8086 0e26 1 ioatdma - - 00:03:41.835 I/OAT 0000:80:04.7 8086 0e27 1 ioatdma - - 00:03:41.835 NVMe 0000:81:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:03:41.835 04:04:29 -- spdk/autotest.sh@130 -- # uname -s 00:03:41.835 04:04:29 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:03:41.835 04:04:29 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:03:41.835 04:04:29 -- common/autotest_common.sh@1527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:43.209 0000:00:04.7 (8086 0e27): ioatdma -> vfio-pci 00:03:43.209 0000:00:04.6 (8086 0e26): ioatdma -> vfio-pci 00:03:43.209 0000:00:04.5 (8086 0e25): ioatdma -> vfio-pci 00:03:43.209 0000:00:04.4 (8086 0e24): ioatdma -> vfio-pci 00:03:43.209 0000:00:04.3 (8086 0e23): ioatdma -> vfio-pci 00:03:43.209 0000:00:04.2 (8086 0e22): ioatdma -> vfio-pci 00:03:43.209 0000:00:04.1 (8086 0e21): ioatdma -> vfio-pci 00:03:43.209 0000:00:04.0 (8086 0e20): ioatdma -> vfio-pci 00:03:43.209 
0000:80:04.7 (8086 0e27): ioatdma -> vfio-pci 00:03:43.209 0000:80:04.6 (8086 0e26): ioatdma -> vfio-pci 00:03:43.467 0000:80:04.5 (8086 0e25): ioatdma -> vfio-pci 00:03:43.467 0000:80:04.4 (8086 0e24): ioatdma -> vfio-pci 00:03:43.467 0000:80:04.3 (8086 0e23): ioatdma -> vfio-pci 00:03:43.467 0000:80:04.2 (8086 0e22): ioatdma -> vfio-pci 00:03:43.467 0000:80:04.1 (8086 0e21): ioatdma -> vfio-pci 00:03:43.467 0000:80:04.0 (8086 0e20): ioatdma -> vfio-pci 00:03:45.367 0000:81:00.0 (8086 0a54): nvme -> vfio-pci 00:03:45.367 04:04:33 -- common/autotest_common.sh@1528 -- # sleep 1 00:03:46.301 04:04:34 -- common/autotest_common.sh@1529 -- # bdfs=() 00:03:46.301 04:04:34 -- common/autotest_common.sh@1529 -- # local bdfs 00:03:46.301 04:04:34 -- common/autotest_common.sh@1530 -- # bdfs=($(get_nvme_bdfs)) 00:03:46.301 04:04:34 -- common/autotest_common.sh@1530 -- # get_nvme_bdfs 00:03:46.301 04:04:34 -- common/autotest_common.sh@1509 -- # bdfs=() 00:03:46.301 04:04:34 -- common/autotest_common.sh@1509 -- # local bdfs 00:03:46.301 04:04:34 -- common/autotest_common.sh@1510 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:03:46.301 04:04:34 -- common/autotest_common.sh@1510 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:03:46.301 04:04:34 -- common/autotest_common.sh@1510 -- # jq -r '.config[].params.traddr' 00:03:46.560 04:04:34 -- common/autotest_common.sh@1511 -- # (( 1 == 0 )) 00:03:46.560 04:04:34 -- common/autotest_common.sh@1515 -- # printf '%s\n' 0000:81:00.0 00:03:46.560 04:04:34 -- common/autotest_common.sh@1532 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:47.941 Waiting for block devices as requested 00:03:47.941 0000:81:00.0 (8086 0a54): vfio-pci -> nvme 00:03:47.941 0000:00:04.7 (8086 0e27): vfio-pci -> ioatdma 00:03:47.941 0000:00:04.6 (8086 0e26): vfio-pci -> ioatdma 00:03:47.941 0000:00:04.5 (8086 0e25): vfio-pci -> ioatdma 00:03:48.199 0000:00:04.4 (8086 0e24): vfio-pci -> ioatdma 00:03:48.199 0000:00:04.3 (8086 0e23): vfio-pci -> ioatdma 00:03:48.199 0000:00:04.2 (8086 0e22): vfio-pci -> ioatdma 00:03:48.199 0000:00:04.1 (8086 0e21): vfio-pci -> ioatdma 00:03:48.199 0000:00:04.0 (8086 0e20): vfio-pci -> ioatdma 00:03:48.458 0000:80:04.7 (8086 0e27): vfio-pci -> ioatdma 00:03:48.458 0000:80:04.6 (8086 0e26): vfio-pci -> ioatdma 00:03:48.458 0000:80:04.5 (8086 0e25): vfio-pci -> ioatdma 00:03:48.718 0000:80:04.4 (8086 0e24): vfio-pci -> ioatdma 00:03:48.718 0000:80:04.3 (8086 0e23): vfio-pci -> ioatdma 00:03:48.718 0000:80:04.2 (8086 0e22): vfio-pci -> ioatdma 00:03:48.718 0000:80:04.1 (8086 0e21): vfio-pci -> ioatdma 00:03:48.977 0000:80:04.0 (8086 0e20): vfio-pci -> ioatdma 00:03:48.977 04:04:36 -- common/autotest_common.sh@1534 -- # for bdf in "${bdfs[@]}" 00:03:48.977 04:04:36 -- common/autotest_common.sh@1535 -- # get_nvme_ctrlr_from_bdf 0000:81:00.0 00:03:48.977 04:04:36 -- common/autotest_common.sh@1498 -- # readlink -f /sys/class/nvme/nvme0 00:03:48.977 04:04:36 -- common/autotest_common.sh@1498 -- # grep 0000:81:00.0/nvme/nvme 00:03:48.977 04:04:36 -- common/autotest_common.sh@1498 -- # bdf_sysfs_path=/sys/devices/pci0000:80/0000:80:01.0/0000:81:00.0/nvme/nvme0 00:03:48.977 04:04:36 -- common/autotest_common.sh@1499 -- # [[ -z /sys/devices/pci0000:80/0000:80:01.0/0000:81:00.0/nvme/nvme0 ]] 00:03:48.977 04:04:36 -- common/autotest_common.sh@1503 -- # basename /sys/devices/pci0000:80/0000:80:01.0/0000:81:00.0/nvme/nvme0 00:03:48.977 04:04:36 -- common/autotest_common.sh@1503 -- 
# printf '%s\n' nvme0 00:03:48.977 04:04:36 -- common/autotest_common.sh@1535 -- # nvme_ctrlr=/dev/nvme0 00:03:48.977 04:04:36 -- common/autotest_common.sh@1536 -- # [[ -z /dev/nvme0 ]] 00:03:48.977 04:04:36 -- common/autotest_common.sh@1541 -- # nvme id-ctrl /dev/nvme0 00:03:48.977 04:04:36 -- common/autotest_common.sh@1541 -- # grep oacs 00:03:48.977 04:04:36 -- common/autotest_common.sh@1541 -- # cut -d: -f2 00:03:48.977 04:04:36 -- common/autotest_common.sh@1541 -- # oacs=' 0xe' 00:03:48.977 04:04:36 -- common/autotest_common.sh@1542 -- # oacs_ns_manage=8 00:03:48.977 04:04:36 -- common/autotest_common.sh@1544 -- # [[ 8 -ne 0 ]] 00:03:48.977 04:04:36 -- common/autotest_common.sh@1550 -- # nvme id-ctrl /dev/nvme0 00:03:48.977 04:04:36 -- common/autotest_common.sh@1550 -- # grep unvmcap 00:03:48.977 04:04:36 -- common/autotest_common.sh@1550 -- # cut -d: -f2 00:03:48.977 04:04:36 -- common/autotest_common.sh@1550 -- # unvmcap=' 0' 00:03:48.977 04:04:36 -- common/autotest_common.sh@1551 -- # [[ 0 -eq 0 ]] 00:03:48.977 04:04:36 -- common/autotest_common.sh@1553 -- # continue 00:03:48.977 04:04:36 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:03:48.977 04:04:36 -- common/autotest_common.sh@726 -- # xtrace_disable 00:03:48.977 04:04:36 -- common/autotest_common.sh@10 -- # set +x 00:03:48.977 04:04:36 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:03:48.977 04:04:36 -- common/autotest_common.sh@720 -- # xtrace_disable 00:03:48.977 04:04:36 -- common/autotest_common.sh@10 -- # set +x 00:03:48.977 04:04:36 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:50.431 0000:00:04.7 (8086 0e27): ioatdma -> vfio-pci 00:03:50.431 0000:00:04.6 (8086 0e26): ioatdma -> vfio-pci 00:03:50.431 0000:00:04.5 (8086 0e25): ioatdma -> vfio-pci 00:03:50.431 0000:00:04.4 (8086 0e24): ioatdma -> vfio-pci 00:03:50.431 0000:00:04.3 (8086 0e23): ioatdma -> vfio-pci 00:03:50.431 0000:00:04.2 (8086 0e22): ioatdma -> vfio-pci 00:03:50.431 0000:00:04.1 (8086 0e21): ioatdma -> vfio-pci 00:03:50.431 0000:00:04.0 (8086 0e20): ioatdma -> vfio-pci 00:03:50.431 0000:80:04.7 (8086 0e27): ioatdma -> vfio-pci 00:03:50.431 0000:80:04.6 (8086 0e26): ioatdma -> vfio-pci 00:03:50.431 0000:80:04.5 (8086 0e25): ioatdma -> vfio-pci 00:03:50.431 0000:80:04.4 (8086 0e24): ioatdma -> vfio-pci 00:03:50.431 0000:80:04.3 (8086 0e23): ioatdma -> vfio-pci 00:03:50.431 0000:80:04.2 (8086 0e22): ioatdma -> vfio-pci 00:03:50.431 0000:80:04.1 (8086 0e21): ioatdma -> vfio-pci 00:03:50.431 0000:80:04.0 (8086 0e20): ioatdma -> vfio-pci 00:03:52.329 0000:81:00.0 (8086 0a54): nvme -> vfio-pci 00:03:52.588 04:04:40 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:03:52.588 04:04:40 -- common/autotest_common.sh@726 -- # xtrace_disable 00:03:52.588 04:04:40 -- common/autotest_common.sh@10 -- # set +x 00:03:52.588 04:04:40 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:03:52.588 04:04:40 -- common/autotest_common.sh@1587 -- # mapfile -t bdfs 00:03:52.588 04:04:40 -- common/autotest_common.sh@1587 -- # get_nvme_bdfs_by_id 0x0a54 00:03:52.588 04:04:40 -- common/autotest_common.sh@1573 -- # bdfs=() 00:03:52.588 04:04:40 -- common/autotest_common.sh@1573 -- # local bdfs 00:03:52.588 04:04:40 -- common/autotest_common.sh@1575 -- # get_nvme_bdfs 00:03:52.588 04:04:40 -- common/autotest_common.sh@1509 -- # bdfs=() 00:03:52.588 04:04:40 -- common/autotest_common.sh@1509 -- # local bdfs 00:03:52.588 04:04:40 -- common/autotest_common.sh@1510 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r 
'.config[].params.traddr')) 00:03:52.588 04:04:40 -- common/autotest_common.sh@1510 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:03:52.588 04:04:40 -- common/autotest_common.sh@1510 -- # jq -r '.config[].params.traddr' 00:03:52.588 04:04:40 -- common/autotest_common.sh@1511 -- # (( 1 == 0 )) 00:03:52.588 04:04:40 -- common/autotest_common.sh@1515 -- # printf '%s\n' 0000:81:00.0 00:03:52.588 04:04:40 -- common/autotest_common.sh@1575 -- # for bdf in $(get_nvme_bdfs) 00:03:52.588 04:04:40 -- common/autotest_common.sh@1576 -- # cat /sys/bus/pci/devices/0000:81:00.0/device 00:03:52.588 04:04:40 -- common/autotest_common.sh@1576 -- # device=0x0a54 00:03:52.588 04:04:40 -- common/autotest_common.sh@1577 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:03:52.588 04:04:40 -- common/autotest_common.sh@1578 -- # bdfs+=($bdf) 00:03:52.588 04:04:40 -- common/autotest_common.sh@1582 -- # printf '%s\n' 0000:81:00.0 00:03:52.588 04:04:40 -- common/autotest_common.sh@1588 -- # [[ -z 0000:81:00.0 ]] 00:03:52.588 04:04:40 -- common/autotest_common.sh@1593 -- # spdk_tgt_pid=3786030 00:03:52.588 04:04:40 -- common/autotest_common.sh@1592 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:03:52.588 04:04:40 -- common/autotest_common.sh@1594 -- # waitforlisten 3786030 00:03:52.588 04:04:40 -- common/autotest_common.sh@827 -- # '[' -z 3786030 ']' 00:03:52.588 04:04:40 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:03:52.588 04:04:40 -- common/autotest_common.sh@832 -- # local max_retries=100 00:03:52.588 04:04:40 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:03:52.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:03:52.588 04:04:40 -- common/autotest_common.sh@836 -- # xtrace_disable 00:03:52.588 04:04:40 -- common/autotest_common.sh@10 -- # set +x 00:03:52.588 [2024-05-15 04:04:40.507186] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
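The opal cleanup path above discovers NVMe controllers with bdfs=($(gen_nvme.sh | jq -r '.config[].params.traddr')) and then keeps only drives whose PCI device id file reads 0x0a54 (the 8086 0a54 entry from the earlier Type/BDF table). A rough sysfs-only equivalent of that filter, shown for illustration rather than as what autotest_common.sh literally does:

    # Print BDFs of NVMe controllers (PCI class 0x010802) with device id 0x0a54.
    for dev in /sys/bus/pci/devices/*; do
        [[ $(cat "$dev/class") == 0x010802* ]] || continue   # NVMe controller class
        [[ $(cat "$dev/device") == 0x0a54 ]] || continue     # id the opal cleanup filters on
        basename "$dev"                                      # e.g. 0000:81:00.0
    done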
00:03:52.588 [2024-05-15 04:04:40.507265] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3786030 ] 00:03:52.588 [2024-05-15 04:04:40.579420] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:52.847 [2024-05-15 04:04:40.692415] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:03:53.104 04:04:40 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:03:53.104 04:04:40 -- common/autotest_common.sh@860 -- # return 0 00:03:53.104 04:04:40 -- common/autotest_common.sh@1596 -- # bdf_id=0 00:03:53.104 04:04:40 -- common/autotest_common.sh@1597 -- # for bdf in "${bdfs[@]}" 00:03:53.104 04:04:40 -- common/autotest_common.sh@1598 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:81:00.0 00:03:56.381 nvme0n1 00:03:56.381 04:04:44 -- common/autotest_common.sh@1600 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:03:56.381 [2024-05-15 04:04:44.246962] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:03:56.381 request: 00:03:56.381 { 00:03:56.381 "nvme_ctrlr_name": "nvme0", 00:03:56.381 "password": "test", 00:03:56.381 "method": "bdev_nvme_opal_revert", 00:03:56.381 "req_id": 1 00:03:56.381 } 00:03:56.381 Got JSON-RPC error response 00:03:56.381 response: 00:03:56.381 { 00:03:56.381 "code": -32602, 00:03:56.381 "message": "Invalid parameters" 00:03:56.381 } 00:03:56.381 04:04:44 -- common/autotest_common.sh@1600 -- # true 00:03:56.381 04:04:44 -- common/autotest_common.sh@1601 -- # (( ++bdf_id )) 00:03:56.381 04:04:44 -- common/autotest_common.sh@1604 -- # killprocess 3786030 00:03:56.381 04:04:44 -- common/autotest_common.sh@946 -- # '[' -z 3786030 ']' 00:03:56.381 04:04:44 -- common/autotest_common.sh@950 -- # kill -0 3786030 00:03:56.381 04:04:44 -- common/autotest_common.sh@951 -- # uname 00:03:56.381 04:04:44 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:03:56.381 04:04:44 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3786030 00:03:56.381 04:04:44 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:03:56.381 04:04:44 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:03:56.381 04:04:44 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3786030' 00:03:56.381 killing process with pid 3786030 00:03:56.381 04:04:44 -- common/autotest_common.sh@965 -- # kill 3786030 00:03:56.381 04:04:44 -- common/autotest_common.sh@970 -- # wait 3786030 00:03:59.659 04:04:47 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:03:59.659 04:04:47 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:03:59.659 04:04:47 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:03:59.659 04:04:47 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:03:59.659 04:04:47 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:03:59.659 Restarting all devices. 
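With spdk_tgt listening on /var/tmp/spdk.sock, the opal step above is driven entirely over JSON-RPC: attach the controller at 0000:81:00.0 as bdev controller nvme0, then call bdev_nvme_opal_revert and tolerate the failure, since this drive reports no Opal support and returns the -32602 "Invalid parameters" response logged above. The same two calls can be replayed by hand against a target started the same way; the rpc.py path is the one used throughout this workspace, and the error handling here is just a sketch:

    # Replay the opal-revert probe over the running target's RPC socket.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    $rpc bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:81:00.0    # exposes nvme0n1
    if ! $rpc bdev_nvme_opal_revert -b nvme0 -p test; then
        echo 'nvme0 does not support Opal; revert skipped'               # expected on this drive
    fi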
00:04:02.935 lstat() error: No such file or directory 00:04:02.935 QAT Error: No GENERAL section found 00:04:02.935 Failed to configure qat_dev0 00:04:02.935 lstat() error: No such file or directory 00:04:02.935 QAT Error: No GENERAL section found 00:04:02.935 Failed to configure qat_dev1 00:04:02.935 lstat() error: No such file or directory 00:04:02.935 QAT Error: No GENERAL section found 00:04:02.935 Failed to configure qat_dev2 00:04:02.935 enable sriov 00:04:02.935 Checking status of all devices. 00:04:02.935 There is 3 QAT acceleration device(s) in the system: 00:04:02.935 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:0c:00.0, #accel: 5 #engines: 10 state: down 00:04:02.935 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:0d:00.0, #accel: 5 #engines: 10 state: down 00:04:02.935 qat_dev2 - type: c6xx, inst_id: 2, node_id: 0, bsf: 0000:0e:00.0, #accel: 5 #engines: 10 state: down 00:04:03.500 0000:0c:00.0 set to 16 VFs 00:04:04.065 0000:0d:00.0 set to 16 VFs 00:04:04.631 0000:0e:00.0 set to 16 VFs 00:04:04.631 Properly configured the qat device with driver uio_pci_generic. 00:04:04.631 04:04:52 -- spdk/autotest.sh@162 -- # timing_enter lib 00:04:04.631 04:04:52 -- common/autotest_common.sh@720 -- # xtrace_disable 00:04:04.631 04:04:52 -- common/autotest_common.sh@10 -- # set +x 00:04:04.631 04:04:52 -- spdk/autotest.sh@164 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:04:04.631 04:04:52 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:04.631 04:04:52 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:04.631 04:04:52 -- common/autotest_common.sh@10 -- # set +x 00:04:04.631 ************************************ 00:04:04.631 START TEST env 00:04:04.631 ************************************ 00:04:04.631 04:04:52 env -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:04:04.631 * Looking for test storage... 
00:04:04.631 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:04:04.631 04:04:52 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:04:04.631 04:04:52 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:04.631 04:04:52 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:04.631 04:04:52 env -- common/autotest_common.sh@10 -- # set +x 00:04:04.631 ************************************ 00:04:04.631 START TEST env_memory 00:04:04.631 ************************************ 00:04:04.631 04:04:52 env.env_memory -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:04:04.631 00:04:04.631 00:04:04.631 CUnit - A unit testing framework for C - Version 2.1-3 00:04:04.631 http://cunit.sourceforge.net/ 00:04:04.631 00:04:04.631 00:04:04.631 Suite: memory 00:04:04.631 Test: alloc and free memory map ...[2024-05-15 04:04:52.630121] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:04:04.631 passed 00:04:04.890 Test: mem map translation ...[2024-05-15 04:04:52.652781] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:04:04.890 [2024-05-15 04:04:52.652840] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:04:04.890 [2024-05-15 04:04:52.652902] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:04:04.890 [2024-05-15 04:04:52.652915] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:04:04.890 passed 00:04:04.890 Test: mem map registration ...[2024-05-15 04:04:52.699186] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:04:04.890 [2024-05-15 04:04:52.699212] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:04:04.890 passed 00:04:04.890 Test: mem map adjacent registrations ...passed 00:04:04.890 00:04:04.890 Run Summary: Type Total Ran Passed Failed Inactive 00:04:04.890 suites 1 1 n/a 0 0 00:04:04.890 tests 4 4 4 0 0 00:04:04.890 asserts 152 152 152 0 n/a 00:04:04.890 00:04:04.890 Elapsed time = 0.157 seconds 00:04:04.890 00:04:04.890 real 0m0.164s 00:04:04.890 user 0m0.158s 00:04:04.890 sys 0m0.005s 00:04:04.890 04:04:52 env.env_memory -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:04.890 04:04:52 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:04:04.890 ************************************ 00:04:04.890 END TEST env_memory 00:04:04.890 ************************************ 00:04:04.890 04:04:52 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:04.890 04:04:52 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:04.890 04:04:52 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:04.890 04:04:52 env 
-- common/autotest_common.sh@10 -- # set +x 00:04:04.890 ************************************ 00:04:04.890 START TEST env_vtophys 00:04:04.890 ************************************ 00:04:04.890 04:04:52 env.env_vtophys -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:04.890 EAL: lib.eal log level changed from notice to debug 00:04:04.890 EAL: Detected lcore 0 as core 0 on socket 0 00:04:04.891 EAL: Detected lcore 1 as core 1 on socket 0 00:04:04.891 EAL: Detected lcore 2 as core 2 on socket 0 00:04:04.891 EAL: Detected lcore 3 as core 3 on socket 0 00:04:04.891 EAL: Detected lcore 4 as core 4 on socket 0 00:04:04.891 EAL: Detected lcore 5 as core 5 on socket 0 00:04:04.891 EAL: Detected lcore 6 as core 8 on socket 0 00:04:04.891 EAL: Detected lcore 7 as core 9 on socket 0 00:04:04.891 EAL: Detected lcore 8 as core 10 on socket 0 00:04:04.891 EAL: Detected lcore 9 as core 11 on socket 0 00:04:04.891 EAL: Detected lcore 10 as core 12 on socket 0 00:04:04.891 EAL: Detected lcore 11 as core 13 on socket 0 00:04:04.891 EAL: Detected lcore 12 as core 0 on socket 1 00:04:04.891 EAL: Detected lcore 13 as core 1 on socket 1 00:04:04.891 EAL: Detected lcore 14 as core 2 on socket 1 00:04:04.891 EAL: Detected lcore 15 as core 3 on socket 1 00:04:04.891 EAL: Detected lcore 16 as core 4 on socket 1 00:04:04.891 EAL: Detected lcore 17 as core 5 on socket 1 00:04:04.891 EAL: Detected lcore 18 as core 8 on socket 1 00:04:04.891 EAL: Detected lcore 19 as core 9 on socket 1 00:04:04.891 EAL: Detected lcore 20 as core 10 on socket 1 00:04:04.891 EAL: Detected lcore 21 as core 11 on socket 1 00:04:04.891 EAL: Detected lcore 22 as core 12 on socket 1 00:04:04.891 EAL: Detected lcore 23 as core 13 on socket 1 00:04:04.891 EAL: Detected lcore 24 as core 0 on socket 0 00:04:04.891 EAL: Detected lcore 25 as core 1 on socket 0 00:04:04.891 EAL: Detected lcore 26 as core 2 on socket 0 00:04:04.891 EAL: Detected lcore 27 as core 3 on socket 0 00:04:04.891 EAL: Detected lcore 28 as core 4 on socket 0 00:04:04.891 EAL: Detected lcore 29 as core 5 on socket 0 00:04:04.891 EAL: Detected lcore 30 as core 8 on socket 0 00:04:04.891 EAL: Detected lcore 31 as core 9 on socket 0 00:04:04.891 EAL: Detected lcore 32 as core 10 on socket 0 00:04:04.891 EAL: Detected lcore 33 as core 11 on socket 0 00:04:04.891 EAL: Detected lcore 34 as core 12 on socket 0 00:04:04.891 EAL: Detected lcore 35 as core 13 on socket 0 00:04:04.891 EAL: Detected lcore 36 as core 0 on socket 1 00:04:04.891 EAL: Detected lcore 37 as core 1 on socket 1 00:04:04.891 EAL: Detected lcore 38 as core 2 on socket 1 00:04:04.891 EAL: Detected lcore 39 as core 3 on socket 1 00:04:04.891 EAL: Detected lcore 40 as core 4 on socket 1 00:04:04.891 EAL: Detected lcore 41 as core 5 on socket 1 00:04:04.891 EAL: Detected lcore 42 as core 8 on socket 1 00:04:04.891 EAL: Detected lcore 43 as core 9 on socket 1 00:04:04.891 EAL: Detected lcore 44 as core 10 on socket 1 00:04:04.891 EAL: Detected lcore 45 as core 11 on socket 1 00:04:04.891 EAL: Detected lcore 46 as core 12 on socket 1 00:04:04.891 EAL: Detected lcore 47 as core 13 on socket 1 00:04:04.891 EAL: Maximum logical cores by configuration: 128 00:04:04.891 EAL: Detected CPU lcores: 48 00:04:04.891 EAL: Detected NUMA nodes: 2 00:04:04.891 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:04:04.891 EAL: Detected shared linkage of DPDK 00:04:04.891 EAL: No shared files mode enabled, IPC will be disabled 00:04:04.891 EAL: No shared files mode 
enabled, IPC is disabled 00:04:04.891 EAL: PCI driver qat for device 0000:0c:01.0 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:01.1 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:01.2 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:01.3 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:01.4 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:01.5 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:01.6 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:01.7 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:02.0 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:02.1 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:02.2 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:02.3 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:02.4 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:02.5 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:02.6 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0c:02.7 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:01.0 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:01.1 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:01.2 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:01.3 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:01.4 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:01.5 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:01.6 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:01.7 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:02.0 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:02.1 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:02.2 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:02.3 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:02.4 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:02.5 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:02.6 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0d:02.7 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:01.0 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:01.1 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:01.2 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:01.3 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:01.4 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:01.5 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:01.6 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:01.7 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:02.0 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:02.1 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:02.2 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:02.3 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:02.4 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:02.5 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat 
for device 0000:0e:02.6 wants IOVA as 'PA' 00:04:04.891 EAL: PCI driver qat for device 0000:0e:02.7 wants IOVA as 'PA' 00:04:04.891 EAL: Bus pci wants IOVA as 'PA' 00:04:04.891 EAL: Bus auxiliary wants IOVA as 'DC' 00:04:04.891 EAL: Bus vdev wants IOVA as 'DC' 00:04:04.891 EAL: Selected IOVA mode 'PA' 00:04:04.891 EAL: Probing VFIO support... 00:04:04.891 EAL: IOMMU type 1 (Type 1) is supported 00:04:04.891 EAL: IOMMU type 7 (sPAPR) is not supported 00:04:04.891 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:04:04.891 EAL: VFIO support initialized 00:04:04.891 EAL: Ask a virtual area of 0x2e000 bytes 00:04:04.891 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:04:04.891 EAL: Setting up physically contiguous memory... 00:04:04.891 EAL: Setting maximum number of open files to 524288 00:04:04.891 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:04:04.891 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:04:04.891 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:04:04.891 EAL: Ask a virtual area of 0x61000 bytes 00:04:04.891 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:04:04.891 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:04.891 EAL: Ask a virtual area of 0x400000000 bytes 00:04:04.891 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:04:04.891 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:04:04.891 EAL: Ask a virtual area of 0x61000 bytes 00:04:04.891 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:04:04.891 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:04.891 EAL: Ask a virtual area of 0x400000000 bytes 00:04:04.891 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:04:04.891 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:04:04.891 EAL: Ask a virtual area of 0x61000 bytes 00:04:04.891 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:04:04.891 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:04.891 EAL: Ask a virtual area of 0x400000000 bytes 00:04:04.891 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:04:04.891 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:04:04.891 EAL: Ask a virtual area of 0x61000 bytes 00:04:04.891 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:04:04.891 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:04.891 EAL: Ask a virtual area of 0x400000000 bytes 00:04:04.891 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:04:04.891 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:04:04.891 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:04:04.891 EAL: Ask a virtual area of 0x61000 bytes 00:04:04.891 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:04:04.891 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:04.891 EAL: Ask a virtual area of 0x400000000 bytes 00:04:04.891 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:04:04.891 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:04:04.891 EAL: Ask a virtual area of 0x61000 bytes 00:04:04.891 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:04:04.891 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:04.891 EAL: Ask a virtual area of 0x400000000 bytes 00:04:04.891 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:04:04.891 EAL: VA reserved for memseg 
list at 0x201400c00000, size 400000000 00:04:04.891 EAL: Ask a virtual area of 0x61000 bytes 00:04:04.891 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:04:04.891 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:04.891 EAL: Ask a virtual area of 0x400000000 bytes 00:04:04.891 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:04:04.891 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:04:04.891 EAL: Ask a virtual area of 0x61000 bytes 00:04:04.891 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:04:04.891 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:04.891 EAL: Ask a virtual area of 0x400000000 bytes 00:04:04.891 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:04:04.891 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:04:04.891 EAL: Hugepages will be freed exactly as allocated. 00:04:04.891 EAL: No shared files mode enabled, IPC is disabled 00:04:04.891 EAL: No shared files mode enabled, IPC is disabled 00:04:04.891 EAL: TSC frequency is ~2700000 KHz 00:04:04.891 EAL: Main lcore 0 is ready (tid=7fd327d9bb00;cpuset=[0]) 00:04:04.891 EAL: Trying to obtain current memory policy. 00:04:04.891 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:04.891 EAL: Restoring previous memory policy: 0 00:04:04.891 EAL: request: mp_malloc_sync 00:04:04.891 EAL: No shared files mode enabled, IPC is disabled 00:04:04.891 EAL: Heap on socket 0 was expanded by 2MB 00:04:04.891 EAL: PCI device 0000:0c:01.0 on NUMA socket 0 00:04:04.891 EAL: probe driver: 8086:37c9 qat 00:04:04.891 EAL: PCI memory mapped at 0x202001000000 00:04:04.892 EAL: PCI memory mapped at 0x202001001000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.0 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:01.1 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001002000 00:04:04.892 EAL: PCI memory mapped at 0x202001003000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.1 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:01.2 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001004000 00:04:04.892 EAL: PCI memory mapped at 0x202001005000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.2 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:01.3 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001006000 00:04:04.892 EAL: PCI memory mapped at 0x202001007000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.3 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:01.4 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001008000 00:04:04.892 EAL: PCI memory mapped at 0x202001009000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.4 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:01.5 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200100a000 00:04:04.892 EAL: PCI memory mapped at 0x20200100b000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.5 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:01.6 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200100c000 00:04:04.892 EAL: PCI memory mapped at 0x20200100d000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:0c:01.6 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:01.7 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200100e000 00:04:04.892 EAL: PCI memory mapped at 0x20200100f000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.7 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:02.0 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001010000 00:04:04.892 EAL: PCI memory mapped at 0x202001011000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.0 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:02.1 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001012000 00:04:04.892 EAL: PCI memory mapped at 0x202001013000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.1 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:02.2 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001014000 00:04:04.892 EAL: PCI memory mapped at 0x202001015000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.2 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:02.3 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001016000 00:04:04.892 EAL: PCI memory mapped at 0x202001017000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.3 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:02.4 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001018000 00:04:04.892 EAL: PCI memory mapped at 0x202001019000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.4 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:02.5 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200101a000 00:04:04.892 EAL: PCI memory mapped at 0x20200101b000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.5 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:02.6 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200101c000 00:04:04.892 EAL: PCI memory mapped at 0x20200101d000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.6 (socket 0) 00:04:04.892 EAL: PCI device 0000:0c:02.7 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200101e000 00:04:04.892 EAL: PCI memory mapped at 0x20200101f000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.7 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:01.0 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001020000 00:04:04.892 EAL: PCI memory mapped at 0x202001021000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.0 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:01.1 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001022000 00:04:04.892 EAL: PCI memory mapped at 0x202001023000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.1 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:01.2 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001024000 00:04:04.892 EAL: PCI memory mapped at 0x202001025000 00:04:04.892 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:0d:01.2 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:01.3 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001026000 00:04:04.892 EAL: PCI memory mapped at 0x202001027000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.3 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:01.4 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001028000 00:04:04.892 EAL: PCI memory mapped at 0x202001029000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.4 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:01.5 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200102a000 00:04:04.892 EAL: PCI memory mapped at 0x20200102b000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.5 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:01.6 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200102c000 00:04:04.892 EAL: PCI memory mapped at 0x20200102d000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.6 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:01.7 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200102e000 00:04:04.892 EAL: PCI memory mapped at 0x20200102f000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.7 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:02.0 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001030000 00:04:04.892 EAL: PCI memory mapped at 0x202001031000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.0 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:02.1 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001032000 00:04:04.892 EAL: PCI memory mapped at 0x202001033000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.1 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:02.2 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001034000 00:04:04.892 EAL: PCI memory mapped at 0x202001035000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.2 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:02.3 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001036000 00:04:04.892 EAL: PCI memory mapped at 0x202001037000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.3 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:02.4 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001038000 00:04:04.892 EAL: PCI memory mapped at 0x202001039000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.4 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:02.5 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200103a000 00:04:04.892 EAL: PCI memory mapped at 0x20200103b000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.5 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:02.6 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200103c000 00:04:04.892 EAL: PCI memory mapped at 0x20200103d000 00:04:04.892 EAL: Probe PCI 
driver: qat (8086:37c9) device: 0000:0d:02.6 (socket 0) 00:04:04.892 EAL: PCI device 0000:0d:02.7 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200103e000 00:04:04.892 EAL: PCI memory mapped at 0x20200103f000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.7 (socket 0) 00:04:04.892 EAL: PCI device 0000:0e:01.0 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001040000 00:04:04.892 EAL: PCI memory mapped at 0x202001041000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.0 (socket 0) 00:04:04.892 EAL: PCI device 0000:0e:01.1 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001042000 00:04:04.892 EAL: PCI memory mapped at 0x202001043000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.1 (socket 0) 00:04:04.892 EAL: PCI device 0000:0e:01.2 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001044000 00:04:04.892 EAL: PCI memory mapped at 0x202001045000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.2 (socket 0) 00:04:04.892 EAL: PCI device 0000:0e:01.3 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001046000 00:04:04.892 EAL: PCI memory mapped at 0x202001047000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.3 (socket 0) 00:04:04.892 EAL: PCI device 0000:0e:01.4 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x202001048000 00:04:04.892 EAL: PCI memory mapped at 0x202001049000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.4 (socket 0) 00:04:04.892 EAL: PCI device 0000:0e:01.5 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200104a000 00:04:04.892 EAL: PCI memory mapped at 0x20200104b000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.5 (socket 0) 00:04:04.892 EAL: PCI device 0000:0e:01.6 on NUMA socket 0 00:04:04.892 EAL: probe driver: 8086:37c9 qat 00:04:04.892 EAL: PCI memory mapped at 0x20200104c000 00:04:04.892 EAL: PCI memory mapped at 0x20200104d000 00:04:04.892 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.6 (socket 0) 00:04:04.893 EAL: PCI device 0000:0e:01.7 on NUMA socket 0 00:04:04.893 EAL: probe driver: 8086:37c9 qat 00:04:04.893 EAL: PCI memory mapped at 0x20200104e000 00:04:04.893 EAL: PCI memory mapped at 0x20200104f000 00:04:04.893 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.7 (socket 0) 00:04:04.893 EAL: PCI device 0000:0e:02.0 on NUMA socket 0 00:04:04.893 EAL: probe driver: 8086:37c9 qat 00:04:04.893 EAL: PCI memory mapped at 0x202001050000 00:04:04.893 EAL: PCI memory mapped at 0x202001051000 00:04:04.893 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.0 (socket 0) 00:04:04.893 EAL: PCI device 0000:0e:02.1 on NUMA socket 0 00:04:04.893 EAL: probe driver: 8086:37c9 qat 00:04:04.893 EAL: PCI memory mapped at 0x202001052000 00:04:04.893 EAL: PCI memory mapped at 0x202001053000 00:04:04.893 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.1 (socket 0) 00:04:04.893 EAL: PCI device 0000:0e:02.2 on NUMA socket 0 00:04:04.893 EAL: probe driver: 8086:37c9 qat 00:04:04.893 EAL: PCI memory mapped at 0x202001054000 00:04:04.893 EAL: PCI memory mapped at 0x202001055000 00:04:04.893 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.2 (socket 0) 00:04:04.893 EAL: PCI device 0000:0e:02.3 on NUMA socket 0 00:04:04.893 EAL: probe driver: 8086:37c9 qat 00:04:04.893 EAL: PCI memory mapped at 0x202001056000 00:04:04.893 EAL: PCI memory mapped at 0x202001057000 00:04:04.893 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.3 (socket 0) 00:04:04.893 EAL: PCI device 0000:0e:02.4 on NUMA socket 0 00:04:04.893 EAL: probe driver: 8086:37c9 qat 00:04:04.893 EAL: PCI memory mapped at 0x202001058000 00:04:04.893 EAL: PCI memory mapped at 0x202001059000 00:04:04.893 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.4 (socket 0) 00:04:04.893 EAL: PCI device 0000:0e:02.5 on NUMA socket 0 00:04:04.893 EAL: probe driver: 8086:37c9 qat 00:04:04.893 EAL: PCI memory mapped at 0x20200105a000 00:04:04.893 EAL: PCI memory mapped at 0x20200105b000 00:04:04.893 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.5 (socket 0) 00:04:04.893 EAL: PCI device 0000:0e:02.6 on NUMA socket 0 00:04:04.893 EAL: probe driver: 8086:37c9 qat 00:04:04.893 EAL: PCI memory mapped at 0x20200105c000 00:04:04.893 EAL: PCI memory mapped at 0x20200105d000 00:04:04.893 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.6 (socket 0) 00:04:04.893 EAL: PCI device 0000:0e:02.7 on NUMA socket 0 00:04:04.893 EAL: probe driver: 8086:37c9 qat 00:04:04.893 EAL: PCI memory mapped at 0x20200105e000 00:04:04.893 EAL: PCI memory mapped at 0x20200105f000 00:04:04.893 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.7 (socket 0) 00:04:04.893 EAL: No shared files mode enabled, IPC is disabled 00:04:04.893 EAL: No shared files mode enabled, IPC is disabled 00:04:04.893 EAL: No PCI address specified using 'addr=' in: bus=pci 00:04:04.893 EAL: Mem event callback 'spdk:(nil)' registered 00:04:05.151 00:04:05.151 00:04:05.151 CUnit - A unit testing framework for C - Version 2.1-3 00:04:05.151 http://cunit.sourceforge.net/ 00:04:05.151 00:04:05.151 00:04:05.151 Suite: components_suite 00:04:05.151 Test: vtophys_malloc_test ...passed 00:04:05.151 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:04:05.151 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:05.151 EAL: Restoring previous memory policy: 4 00:04:05.151 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.151 EAL: request: mp_malloc_sync 00:04:05.151 EAL: No shared files mode enabled, IPC is disabled 00:04:05.151 EAL: Heap on socket 0 was expanded by 4MB 00:04:05.151 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.151 EAL: request: mp_malloc_sync 00:04:05.151 EAL: No shared files mode enabled, IPC is disabled 00:04:05.151 EAL: Heap on socket 0 was shrunk by 4MB 00:04:05.151 EAL: Trying to obtain current memory policy. 00:04:05.151 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:05.151 EAL: Restoring previous memory policy: 4 00:04:05.151 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.151 EAL: request: mp_malloc_sync 00:04:05.151 EAL: No shared files mode enabled, IPC is disabled 00:04:05.151 EAL: Heap on socket 0 was expanded by 6MB 00:04:05.151 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.151 EAL: request: mp_malloc_sync 00:04:05.151 EAL: No shared files mode enabled, IPC is disabled 00:04:05.151 EAL: Heap on socket 0 was shrunk by 6MB 00:04:05.151 EAL: Trying to obtain current memory policy. 
00:04:05.151 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:05.151 EAL: Restoring previous memory policy: 4 00:04:05.151 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.151 EAL: request: mp_malloc_sync 00:04:05.151 EAL: No shared files mode enabled, IPC is disabled 00:04:05.151 EAL: Heap on socket 0 was expanded by 10MB 00:04:05.151 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.151 EAL: request: mp_malloc_sync 00:04:05.151 EAL: No shared files mode enabled, IPC is disabled 00:04:05.151 EAL: Heap on socket 0 was shrunk by 10MB 00:04:05.151 EAL: Trying to obtain current memory policy. 00:04:05.151 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:05.151 EAL: Restoring previous memory policy: 4 00:04:05.151 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.152 EAL: request: mp_malloc_sync 00:04:05.152 EAL: No shared files mode enabled, IPC is disabled 00:04:05.152 EAL: Heap on socket 0 was expanded by 18MB 00:04:05.152 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.152 EAL: request: mp_malloc_sync 00:04:05.152 EAL: No shared files mode enabled, IPC is disabled 00:04:05.152 EAL: Heap on socket 0 was shrunk by 18MB 00:04:05.152 EAL: Trying to obtain current memory policy. 00:04:05.152 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:05.152 EAL: Restoring previous memory policy: 4 00:04:05.152 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.152 EAL: request: mp_malloc_sync 00:04:05.152 EAL: No shared files mode enabled, IPC is disabled 00:04:05.152 EAL: Heap on socket 0 was expanded by 34MB 00:04:05.152 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.152 EAL: request: mp_malloc_sync 00:04:05.152 EAL: No shared files mode enabled, IPC is disabled 00:04:05.152 EAL: Heap on socket 0 was shrunk by 34MB 00:04:05.152 EAL: Trying to obtain current memory policy. 00:04:05.152 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:05.152 EAL: Restoring previous memory policy: 4 00:04:05.152 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.152 EAL: request: mp_malloc_sync 00:04:05.152 EAL: No shared files mode enabled, IPC is disabled 00:04:05.152 EAL: Heap on socket 0 was expanded by 66MB 00:04:05.152 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.152 EAL: request: mp_malloc_sync 00:04:05.152 EAL: No shared files mode enabled, IPC is disabled 00:04:05.152 EAL: Heap on socket 0 was shrunk by 66MB 00:04:05.152 EAL: Trying to obtain current memory policy. 00:04:05.152 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:05.152 EAL: Restoring previous memory policy: 4 00:04:05.152 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.152 EAL: request: mp_malloc_sync 00:04:05.152 EAL: No shared files mode enabled, IPC is disabled 00:04:05.152 EAL: Heap on socket 0 was expanded by 130MB 00:04:05.152 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.152 EAL: request: mp_malloc_sync 00:04:05.152 EAL: No shared files mode enabled, IPC is disabled 00:04:05.152 EAL: Heap on socket 0 was shrunk by 130MB 00:04:05.152 EAL: Trying to obtain current memory policy. 
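The env_vtophys pass above repeatedly grows and shrinks the socket-0 heap through the 'spdk:(nil)' mem event callback while translating buffers to physical addresses. The following is a minimal C sketch of that allocate-and-translate path written against the public spdk/env.h API, not the test's own source; the program name, buffer size, and alignment are illustrative only.

    #include <inttypes.h>
    #include <stdio.h>
    #include "spdk/env.h"

    int main(void)
    {
        struct spdk_env_opts opts;
        void *buf;
        uint64_t paddr;
        uint64_t len = 2 * 1024 * 1024;          /* illustrative size */

        spdk_env_opts_init(&opts);
        opts.name = "vtophys_sketch";            /* illustrative name */
        if (spdk_env_init(&opts) < 0) {
            fprintf(stderr, "spdk_env_init failed\n");
            return 1;
        }

        /* Pinned, DMA-safe allocation from the DPDK heap; allocations of this
         * kind are what drive the "Heap on socket 0 was expanded by ..."
         * callbacks logged above. */
        buf = spdk_dma_malloc(len, 0x1000, NULL);
        if (buf == NULL) {
            spdk_env_fini();
            return 1;
        }

        /* Virtual-to-physical translation of the buffer, the operation the
         * vtophys tests exercise. */
        paddr = spdk_vtophys(buf, &len);
        if (paddr == SPDK_VTOPHYS_ERROR) {
            fprintf(stderr, "translation failed\n");
        } else {
            printf("vaddr %p -> paddr 0x%" PRIx64 "\n", buf, paddr);
        }

        spdk_dma_free(buf);
        spdk_env_fini();
        return 0;
    }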
00:04:05.152 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:05.152 EAL: Restoring previous memory policy: 4 00:04:05.152 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.152 EAL: request: mp_malloc_sync 00:04:05.152 EAL: No shared files mode enabled, IPC is disabled 00:04:05.152 EAL: Heap on socket 0 was expanded by 258MB 00:04:05.410 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.410 EAL: request: mp_malloc_sync 00:04:05.410 EAL: No shared files mode enabled, IPC is disabled 00:04:05.410 EAL: Heap on socket 0 was shrunk by 258MB 00:04:05.410 EAL: Trying to obtain current memory policy. 00:04:05.410 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:05.410 EAL: Restoring previous memory policy: 4 00:04:05.410 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.410 EAL: request: mp_malloc_sync 00:04:05.410 EAL: No shared files mode enabled, IPC is disabled 00:04:05.410 EAL: Heap on socket 0 was expanded by 514MB 00:04:05.667 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.667 EAL: request: mp_malloc_sync 00:04:05.667 EAL: No shared files mode enabled, IPC is disabled 00:04:05.667 EAL: Heap on socket 0 was shrunk by 514MB 00:04:05.667 EAL: Trying to obtain current memory policy. 00:04:05.667 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:05.924 EAL: Restoring previous memory policy: 4 00:04:05.924 EAL: Calling mem event callback 'spdk:(nil)' 00:04:05.924 EAL: request: mp_malloc_sync 00:04:05.924 EAL: No shared files mode enabled, IPC is disabled 00:04:05.924 EAL: Heap on socket 0 was expanded by 1026MB 00:04:06.181 EAL: Calling mem event callback 'spdk:(nil)' 00:04:06.440 EAL: request: mp_malloc_sync 00:04:06.440 EAL: No shared files mode enabled, IPC is disabled 00:04:06.440 EAL: Heap on socket 0 was shrunk by 1026MB 00:04:06.440 passed 00:04:06.440 00:04:06.440 Run Summary: Type Total Ran Passed Failed Inactive 00:04:06.440 suites 1 1 n/a 0 0 00:04:06.440 tests 2 2 2 0 0 00:04:06.440 asserts 7535 7535 7535 0 n/a 00:04:06.440 00:04:06.440 Elapsed time = 1.386 seconds 00:04:06.440 EAL: No shared files mode enabled, IPC is disabled 00:04:06.440 EAL: No shared files mode enabled, IPC is disabled 00:04:06.440 EAL: No shared files mode enabled, IPC is disabled 00:04:06.440 00:04:06.440 real 0m1.530s 00:04:06.440 user 0m0.872s 00:04:06.440 sys 0m0.621s 00:04:06.440 04:04:54 env.env_vtophys -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:06.440 04:04:54 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:04:06.440 ************************************ 00:04:06.440 END TEST env_vtophys 00:04:06.440 ************************************ 00:04:06.440 04:04:54 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:04:06.440 04:04:54 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:06.440 04:04:54 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:06.440 04:04:54 env -- common/autotest_common.sh@10 -- # set +x 00:04:06.440 ************************************ 00:04:06.440 START TEST env_pci 00:04:06.440 ************************************ 00:04:06.440 04:04:54 env.env_pci -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:04:06.440 00:04:06.440 00:04:06.440 CUnit - A unit testing framework for C - Version 2.1-3 00:04:06.440 http://cunit.sourceforge.net/ 00:04:06.440 00:04:06.440 00:04:06.440 Suite: pci 00:04:06.440 Test: pci_hook ...[2024-05-15 04:04:54.404779] 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 3787854 has claimed it 00:04:06.440 EAL: Cannot find device (10000:00:01.0) 00:04:06.440 EAL: Failed to attach device on primary process 00:04:06.440 passed 00:04:06.440 00:04:06.440 Run Summary: Type Total Ran Passed Failed Inactive 00:04:06.440 suites 1 1 n/a 0 0 00:04:06.440 tests 1 1 1 0 0 00:04:06.440 asserts 25 25 25 0 n/a 00:04:06.440 00:04:06.440 Elapsed time = 0.026 seconds 00:04:06.440 00:04:06.440 real 0m0.042s 00:04:06.440 user 0m0.009s 00:04:06.440 sys 0m0.033s 00:04:06.441 04:04:54 env.env_pci -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:06.441 04:04:54 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:04:06.441 ************************************ 00:04:06.441 END TEST env_pci 00:04:06.441 ************************************ 00:04:06.441 04:04:54 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:04:06.441 04:04:54 env -- env/env.sh@15 -- # uname 00:04:06.441 04:04:54 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:04:06.441 04:04:54 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:04:06.700 04:04:54 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:06.700 04:04:54 env -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:04:06.700 04:04:54 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:06.700 04:04:54 env -- common/autotest_common.sh@10 -- # set +x 00:04:06.700 ************************************ 00:04:06.700 START TEST env_dpdk_post_init 00:04:06.700 ************************************ 00:04:06.700 04:04:54 env.env_dpdk_post_init -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:06.700 EAL: Detected CPU lcores: 48 00:04:06.700 EAL: Detected NUMA nodes: 2 00:04:06.700 EAL: Detected shared linkage of DPDK 00:04:06.700 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:06.700 EAL: Selected IOVA mode 'PA' 00:04:06.700 EAL: VFIO support initialized 00:04:06.700 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.0 (socket 0) 00:04:06.700 CRYPTODEV: Creating cryptodev 0000:0c:01.0_qat_sym 00:04:06.700 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.700 CRYPTODEV: Creating cryptodev 0000:0c:01.0_qat_asym 00:04:06.700 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.700 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.1 (socket 0) 00:04:06.700 CRYPTODEV: Creating cryptodev 0000:0c:01.1_qat_sym 00:04:06.700 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.700 CRYPTODEV: Creating cryptodev 0000:0c:01.1_qat_asym 00:04:06.700 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.700 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.2 (socket 0) 00:04:06.700 CRYPTODEV: Creating cryptodev 0000:0c:01.2_qat_sym 00:04:06.700 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.700 CRYPTODEV: Creating cryptodev 0000:0c:01.2_qat_asym 00:04:06.700 CRYPTODEV: 
Initialisation parameters - name: 0000:0c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.3 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:01.3_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:01.3_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.4 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:01.4_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:01.4_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.5 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:01.5_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:01.5_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.6 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:01.6_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:01.6_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.7 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:01.7_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:01.7_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.0 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.0_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.0_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.1 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.1_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.1_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.2 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.2_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.2_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 
0000:0c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.3 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.3_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.3_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.4 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.4_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.4_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.5 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.5_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.5_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.6 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.6_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.6_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.7 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.7_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0c:02.7_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.0 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.0_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.0_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.1 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.1_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.1_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.2 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.2_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.2_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.2_qat_asym,socket id: 0, max queue 
pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.3 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.3_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.3_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.4 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.4_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.4_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.5 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.5_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.5_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.6 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.6_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.6_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.7 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.7_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:01.7_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.0 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:02.0_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:02.0_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.1 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:02.1_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:02.1_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.2 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:02.2_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:02.2_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:0d:02.3 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:02.3_qat_sym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:02.3_qat_asym 00:04:06.701 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.701 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.4 (socket 0) 00:04:06.701 CRYPTODEV: Creating cryptodev 0000:0d:02.4_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0d:02.4_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.5 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0d:02.5_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0d:02.5_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.6 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0d:02.6_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0d:02.6_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.7 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0d:02.7_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0d:02.7_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.0 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.0_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.0_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.1 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.1_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.1_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.2 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.2_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.2_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.3 (socket 0) 
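env_dpdk_post_init is launched above with '-c 0x1 --base-virtaddr=0x200000000000', and spdk_env_init() then performs the EAL bring-up shown in this log (lcore/NUMA detection, IOVA mode selection, and the qat/ioat/nvme PCI probing). A minimal sketch of how a program would pass the same core mask and base virtual address through the public struct spdk_env_opts fields; the program name is illustrative and this is not the test binary itself.

    #include <stdio.h>
    #include "spdk/env.h"

    int main(void)
    {
        struct spdk_env_opts opts;

        spdk_env_opts_init(&opts);
        opts.name = "post_init_sketch";            /* illustrative name */
        opts.core_mask = "0x1";                    /* same as -c 0x1 above */
        opts.base_virtaddr = 0x200000000000ULL;    /* same as --base-virtaddr above */

        /* Initializes the DPDK environment: detects lcores and NUMA nodes,
         * selects the IOVA mode, and probes PCI devices as logged above. */
        if (spdk_env_init(&opts) < 0) {
            fprintf(stderr, "spdk_env_init failed\n");
            return 1;
        }

        spdk_env_fini();
        return 0;
    }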
00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.3_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.3_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.4 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.4_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.4_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.5 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.5_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.5_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.6 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.6_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.6_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.7 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.7_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:01.7_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.0 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.0_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.0_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.1 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.1_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.1_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.2 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.2_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.2_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.3 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 
0000:0e:02.3_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.3_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.4 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.4_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.4_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.5 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.5_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.5_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.6 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.6_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.6_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.7 (socket 0) 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.7_qat_sym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:06.702 CRYPTODEV: Creating cryptodev 0000:0e:02.7_qat_asym 00:04:06.702 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:06.702 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:06.702 EAL: Using IOMMU type 1 (Type 1) 00:04:06.702 EAL: Probe PCI driver: spdk_ioat (8086:0e20) device: 0000:00:04.0 (socket 0) 00:04:06.702 EAL: Probe PCI driver: spdk_ioat (8086:0e21) device: 0000:00:04.1 (socket 0) 00:04:06.702 EAL: Probe PCI driver: spdk_ioat (8086:0e22) device: 0000:00:04.2 (socket 0) 00:04:06.702 EAL: Probe PCI driver: spdk_ioat (8086:0e23) device: 0000:00:04.3 (socket 0) 00:04:06.702 EAL: Probe PCI driver: spdk_ioat (8086:0e24) device: 0000:00:04.4 (socket 0) 00:04:06.702 EAL: Probe PCI driver: spdk_ioat (8086:0e25) device: 0000:00:04.5 (socket 0) 00:04:06.702 EAL: Probe PCI driver: spdk_ioat (8086:0e26) device: 0000:00:04.6 (socket 0) 00:04:06.960 EAL: Probe PCI driver: spdk_ioat (8086:0e27) device: 0000:00:04.7 (socket 0) 00:04:06.960 EAL: Probe PCI driver: spdk_ioat (8086:0e20) device: 0000:80:04.0 (socket 1) 00:04:06.960 EAL: Probe PCI driver: spdk_ioat (8086:0e21) device: 0000:80:04.1 (socket 1) 00:04:06.960 EAL: Probe PCI driver: spdk_ioat (8086:0e22) device: 0000:80:04.2 (socket 1) 00:04:06.961 EAL: Probe PCI driver: spdk_ioat (8086:0e23) device: 0000:80:04.3 (socket 1) 00:04:06.961 EAL: Probe PCI driver: spdk_ioat (8086:0e24) device: 0000:80:04.4 (socket 1) 00:04:06.961 EAL: Probe PCI driver: spdk_ioat (8086:0e25) device: 0000:80:04.5 (socket 1) 00:04:06.961 EAL: Probe PCI driver: spdk_ioat (8086:0e26) device: 0000:80:04.6 (socket 1) 00:04:06.961 EAL: Probe 
PCI driver: spdk_ioat (8086:0e27) device: 0000:80:04.7 (socket 1) 00:04:07.895 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:81:00.0 (socket 1) 00:04:12.080 EAL: Releasing PCI mapped resource for 0000:81:00.0 00:04:12.080 EAL: Calling pci_unmap_resource for 0000:81:00.0 at 0x2020010a0000 00:04:12.080 Starting DPDK initialization... 00:04:12.080 Starting SPDK post initialization... 00:04:12.080 SPDK NVMe probe 00:04:12.080 Attaching to 0000:81:00.0 00:04:12.080 Attached to 0000:81:00.0 00:04:12.080 Cleaning up... 00:04:12.080 00:04:12.080 real 0m5.306s 00:04:12.080 user 0m4.037s 00:04:12.080 sys 0m0.325s 00:04:12.080 04:04:59 env.env_dpdk_post_init -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:12.080 04:04:59 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:04:12.080 ************************************ 00:04:12.080 END TEST env_dpdk_post_init 00:04:12.080 ************************************ 00:04:12.080 04:04:59 env -- env/env.sh@26 -- # uname 00:04:12.080 04:04:59 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:04:12.080 04:04:59 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:04:12.080 04:04:59 env -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:12.080 04:04:59 env -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:12.080 04:04:59 env -- common/autotest_common.sh@10 -- # set +x 00:04:12.080 ************************************ 00:04:12.080 START TEST env_mem_callbacks 00:04:12.080 ************************************ 00:04:12.080 04:04:59 env.env_mem_callbacks -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:04:12.080 EAL: Detected CPU lcores: 48 00:04:12.080 EAL: Detected NUMA nodes: 2 00:04:12.080 EAL: Detected shared linkage of DPDK 00:04:12.080 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:12.080 EAL: Selected IOVA mode 'PA' 00:04:12.080 EAL: VFIO support initialized 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.0 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.0_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.0_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.1 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.1_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.1_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.2 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.2_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.2_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.3 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.3_qat_sym 00:04:12.080 CRYPTODEV: 
Initialisation parameters - name: 0000:0c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.3_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.4 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.4_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.4_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.5 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.5_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.5_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.6 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.6_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.6_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:01.7 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.7_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:01.7_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.0 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.0_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.0_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.1 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.1_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.1_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.2 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.2_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.2_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.3 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.3_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 
0000:0c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.3_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.4 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.4_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.4_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.5 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.5_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.5_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.6 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.6_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.6_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0c:02.7 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.7_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0c:02.7_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.0 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0d:01.0_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0d:01.0_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.1 (socket 0) 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0d:01.1_qat_sym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.080 CRYPTODEV: Creating cryptodev 0000:0d:01.1_qat_asym 00:04:12.080 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.080 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.2 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.2_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.2_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.3 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.3_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.3_qat_sym,socket id: 0, max queue 
pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.3_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.4 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.4_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.4_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.5 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.5_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.5_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.6 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.6_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.6_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:01.7 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.7_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:01.7_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.0 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.0_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.0_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.1 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.1_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.1_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.2 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.2_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.2_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.3 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.3_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating 
cryptodev 0000:0d:02.3_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.4 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.4_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.4_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.5 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.5_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.5_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.6 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.6_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.6_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0d:02.7 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.7_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0d:02.7_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.0 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.0_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.0_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.1 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.1_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.1_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.2 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.2_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.2_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.3 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.3_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.3_qat_asym 00:04:12.081 
CRYPTODEV: Initialisation parameters - name: 0000:0e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.4 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.4_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.4_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.5 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.5_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.5_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.6 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.6_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.6_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:01.7 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.7_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:01.7_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.0 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.0_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.0_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.1 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.1_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.1_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.2 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.2_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.2_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.3 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.3_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.3_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 
0000:0e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.4 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.4_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.4_qat_asym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.081 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.5 (socket 0) 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.5_qat_sym 00:04:12.081 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.081 CRYPTODEV: Creating cryptodev 0000:0e:02.5_qat_asym 00:04:12.082 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.082 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.6 (socket 0) 00:04:12.082 CRYPTODEV: Creating cryptodev 0000:0e:02.6_qat_sym 00:04:12.082 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.082 CRYPTODEV: Creating cryptodev 0000:0e:02.6_qat_asym 00:04:12.082 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.082 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:0e:02.7 (socket 0) 00:04:12.082 CRYPTODEV: Creating cryptodev 0000:0e:02.7_qat_sym 00:04:12.082 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:04:12.082 CRYPTODEV: Creating cryptodev 0000:0e:02.7_qat_asym 00:04:12.082 CRYPTODEV: Initialisation parameters - name: 0000:0e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:04:12.082 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:12.082 00:04:12.082 00:04:12.082 CUnit - A unit testing framework for C - Version 2.1-3 00:04:12.082 http://cunit.sourceforge.net/ 00:04:12.082 00:04:12.082 00:04:12.082 Suite: memory 00:04:12.082 Test: test ... 
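Note: every 8086:37c9 function probed above is a QAT virtual function; DPDK's qat driver registers a _qat_sym and a _qat_asym cryptodev for each one, which is why the enumeration runs so long. To sanity-check what a node exposes before a run, something along these lines works (the lspci filter and SPDK's stock setup.sh helper are my suggestions, not commands taken from this trace):

    # Count the QAT virtual functions (vendor:device 8086:37c9) visible on the PCI bus
    lspci -nn -d 8086:37c9 | wc -l
    # Show which driver each SPDK-managed device is currently bound to
    sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status

The CUnit "memory" suite whose header appears just above continues below.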
00:04:12.082 register 0x200000200000 2097152 00:04:12.082 malloc 3145728 00:04:12.082 register 0x200000400000 4194304 00:04:12.082 buf 0x200000500000 len 3145728 PASSED 00:04:12.082 malloc 64 00:04:12.082 buf 0x2000004fff40 len 64 PASSED 00:04:12.082 malloc 4194304 00:04:12.082 register 0x200000800000 6291456 00:04:12.082 buf 0x200000a00000 len 4194304 PASSED 00:04:12.082 free 0x200000500000 3145728 00:04:12.082 free 0x2000004fff40 64 00:04:12.082 unregister 0x200000400000 4194304 PASSED 00:04:12.082 free 0x200000a00000 4194304 00:04:12.082 unregister 0x200000800000 6291456 PASSED 00:04:12.082 malloc 8388608 00:04:12.082 register 0x200000400000 10485760 00:04:12.082 buf 0x200000600000 len 8388608 PASSED 00:04:12.082 free 0x200000600000 8388608 00:04:12.082 unregister 0x200000400000 10485760 PASSED 00:04:12.082 passed 00:04:12.082 00:04:12.082 Run Summary: Type Total Ran Passed Failed Inactive 00:04:12.082 suites 1 1 n/a 0 0 00:04:12.082 tests 1 1 1 0 0 00:04:12.082 asserts 15 15 15 0 n/a 00:04:12.082 00:04:12.082 Elapsed time = 0.006 seconds 00:04:12.082 00:04:12.082 real 0m0.069s 00:04:12.082 user 0m0.025s 00:04:12.082 sys 0m0.043s 00:04:12.082 04:04:59 env.env_mem_callbacks -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:12.082 04:04:59 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:04:12.082 ************************************ 00:04:12.082 END TEST env_mem_callbacks 00:04:12.082 ************************************ 00:04:12.082 00:04:12.082 real 0m7.414s 00:04:12.082 user 0m5.215s 00:04:12.082 sys 0m1.223s 00:04:12.082 04:04:59 env -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:12.082 04:04:59 env -- common/autotest_common.sh@10 -- # set +x 00:04:12.082 ************************************ 00:04:12.082 END TEST env 00:04:12.082 ************************************ 00:04:12.082 04:04:59 -- spdk/autotest.sh@165 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:04:12.082 04:04:59 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:12.082 04:04:59 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:12.082 04:04:59 -- common/autotest_common.sh@10 -- # set +x 00:04:12.082 ************************************ 00:04:12.082 START TEST rpc 00:04:12.082 ************************************ 00:04:12.082 04:04:59 rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:04:12.082 * Looking for test storage... 00:04:12.082 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:04:12.082 04:05:00 rpc -- rpc/rpc.sh@65 -- # spdk_pid=3788641 00:04:12.082 04:05:00 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:04:12.082 04:05:00 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:04:12.082 04:05:00 rpc -- rpc/rpc.sh@67 -- # waitforlisten 3788641 00:04:12.082 04:05:00 rpc -- common/autotest_common.sh@827 -- # '[' -z 3788641 ']' 00:04:12.082 04:05:00 rpc -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:12.082 04:05:00 rpc -- common/autotest_common.sh@832 -- # local max_retries=100 00:04:12.082 04:05:00 rpc -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:12.082 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
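Note: the register/unregister pairs and the 15 asserts above come from the env_mem_callbacks unit test, which exercises SPDK's dynamic memory registration hooks. The rpc suite that starts here (test/rpc/rpc.sh) launches spdk_tgt -e bdev and drives it over /var/tmp/spdk.sock through rpc_cmd, effectively a wrapper around scripts/rpc.py. The rpc_integrity flow traced below can be reproduced by hand against a running target with the same method names and arguments; a rough sketch:

    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    # Create an 8 MB malloc bdev with 512-byte blocks, then layer a passthru bdev on top
    ./scripts/rpc.py bdev_malloc_create 8 512                     # prints the new bdev name, e.g. Malloc0
    ./scripts/rpc.py bdev_passthru_create -b Malloc0 -p Passthru0
    # Both bdevs should now be reported
    ./scripts/rpc.py bdev_get_bdevs | jq length                   # expect 2
    # Tear down in reverse order and confirm the list is empty again
    ./scripts/rpc.py bdev_passthru_delete Passthru0
    ./scripts/rpc.py bdev_malloc_delete Malloc0
    ./scripts/rpc.py bdev_get_bdevs | jq length                   # expect 0

rpc_plugins and rpc_daemon_integrity below follow the same create/inspect/delete pattern (the daemon variant falls back to plain rpc_cmd in this run).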
00:04:12.082 04:05:00 rpc -- common/autotest_common.sh@836 -- # xtrace_disable 00:04:12.082 04:05:00 rpc -- common/autotest_common.sh@10 -- # set +x 00:04:12.082 [2024-05-15 04:05:00.089345] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:04:12.082 [2024-05-15 04:05:00.089430] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3788641 ] 00:04:12.341 [2024-05-15 04:05:00.167365] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:12.341 [2024-05-15 04:05:00.277145] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:04:12.341 [2024-05-15 04:05:00.277216] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 3788641' to capture a snapshot of events at runtime. 00:04:12.341 [2024-05-15 04:05:00.277233] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:04:12.341 [2024-05-15 04:05:00.277247] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:04:12.341 [2024-05-15 04:05:00.277260] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid3788641 for offline analysis/debug. 00:04:12.341 [2024-05-15 04:05:00.277305] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:13.271 04:05:00 rpc -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:04:13.271 04:05:00 rpc -- common/autotest_common.sh@860 -- # return 0 00:04:13.271 04:05:00 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:04:13.271 04:05:00 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:04:13.271 04:05:00 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:04:13.271 04:05:00 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:04:13.271 04:05:00 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:13.272 04:05:00 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:13.272 04:05:00 rpc -- common/autotest_common.sh@10 -- # set +x 00:04:13.272 ************************************ 00:04:13.272 START TEST rpc_integrity 00:04:13.272 ************************************ 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@1121 -- # rpc_integrity 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:04:13.272 04:05:01 rpc.rpc_integrity -- 
rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:04:13.272 { 00:04:13.272 "name": "Malloc0", 00:04:13.272 "aliases": [ 00:04:13.272 "27882fdc-3c8f-4218-a6ba-909980d978d8" 00:04:13.272 ], 00:04:13.272 "product_name": "Malloc disk", 00:04:13.272 "block_size": 512, 00:04:13.272 "num_blocks": 16384, 00:04:13.272 "uuid": "27882fdc-3c8f-4218-a6ba-909980d978d8", 00:04:13.272 "assigned_rate_limits": { 00:04:13.272 "rw_ios_per_sec": 0, 00:04:13.272 "rw_mbytes_per_sec": 0, 00:04:13.272 "r_mbytes_per_sec": 0, 00:04:13.272 "w_mbytes_per_sec": 0 00:04:13.272 }, 00:04:13.272 "claimed": false, 00:04:13.272 "zoned": false, 00:04:13.272 "supported_io_types": { 00:04:13.272 "read": true, 00:04:13.272 "write": true, 00:04:13.272 "unmap": true, 00:04:13.272 "write_zeroes": true, 00:04:13.272 "flush": true, 00:04:13.272 "reset": true, 00:04:13.272 "compare": false, 00:04:13.272 "compare_and_write": false, 00:04:13.272 "abort": true, 00:04:13.272 "nvme_admin": false, 00:04:13.272 "nvme_io": false 00:04:13.272 }, 00:04:13.272 "memory_domains": [ 00:04:13.272 { 00:04:13.272 "dma_device_id": "system", 00:04:13.272 "dma_device_type": 1 00:04:13.272 }, 00:04:13.272 { 00:04:13.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:13.272 "dma_device_type": 2 00:04:13.272 } 00:04:13.272 ], 00:04:13.272 "driver_specific": {} 00:04:13.272 } 00:04:13.272 ]' 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.272 [2024-05-15 04:05:01.145670] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:04:13.272 [2024-05-15 04:05:01.145715] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:13.272 [2024-05-15 04:05:01.145740] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x161efd0 00:04:13.272 [2024-05-15 04:05:01.145770] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:13.272 [2024-05-15 04:05:01.147265] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:13.272 [2024-05-15 04:05:01.147293] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:04:13.272 Passthru0 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:04:13.272 04:05:01 rpc.rpc_integrity -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:04:13.272 { 00:04:13.272 "name": "Malloc0", 00:04:13.272 "aliases": [ 00:04:13.272 "27882fdc-3c8f-4218-a6ba-909980d978d8" 00:04:13.272 ], 00:04:13.272 "product_name": "Malloc disk", 00:04:13.272 "block_size": 512, 00:04:13.272 "num_blocks": 16384, 00:04:13.272 "uuid": "27882fdc-3c8f-4218-a6ba-909980d978d8", 00:04:13.272 "assigned_rate_limits": { 00:04:13.272 "rw_ios_per_sec": 0, 00:04:13.272 "rw_mbytes_per_sec": 0, 00:04:13.272 "r_mbytes_per_sec": 0, 00:04:13.272 "w_mbytes_per_sec": 0 00:04:13.272 }, 00:04:13.272 "claimed": true, 00:04:13.272 "claim_type": "exclusive_write", 00:04:13.272 "zoned": false, 00:04:13.272 "supported_io_types": { 00:04:13.272 "read": true, 00:04:13.272 "write": true, 00:04:13.272 "unmap": true, 00:04:13.272 "write_zeroes": true, 00:04:13.272 "flush": true, 00:04:13.272 "reset": true, 00:04:13.272 "compare": false, 00:04:13.272 "compare_and_write": false, 00:04:13.272 "abort": true, 00:04:13.272 "nvme_admin": false, 00:04:13.272 "nvme_io": false 00:04:13.272 }, 00:04:13.272 "memory_domains": [ 00:04:13.272 { 00:04:13.272 "dma_device_id": "system", 00:04:13.272 "dma_device_type": 1 00:04:13.272 }, 00:04:13.272 { 00:04:13.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:13.272 "dma_device_type": 2 00:04:13.272 } 00:04:13.272 ], 00:04:13.272 "driver_specific": {} 00:04:13.272 }, 00:04:13.272 { 00:04:13.272 "name": "Passthru0", 00:04:13.272 "aliases": [ 00:04:13.272 "de56cd8d-3628-59ff-9e48-8e432c5cd321" 00:04:13.272 ], 00:04:13.272 "product_name": "passthru", 00:04:13.272 "block_size": 512, 00:04:13.272 "num_blocks": 16384, 00:04:13.272 "uuid": "de56cd8d-3628-59ff-9e48-8e432c5cd321", 00:04:13.272 "assigned_rate_limits": { 00:04:13.272 "rw_ios_per_sec": 0, 00:04:13.272 "rw_mbytes_per_sec": 0, 00:04:13.272 "r_mbytes_per_sec": 0, 00:04:13.272 "w_mbytes_per_sec": 0 00:04:13.272 }, 00:04:13.272 "claimed": false, 00:04:13.272 "zoned": false, 00:04:13.272 "supported_io_types": { 00:04:13.272 "read": true, 00:04:13.272 "write": true, 00:04:13.272 "unmap": true, 00:04:13.272 "write_zeroes": true, 00:04:13.272 "flush": true, 00:04:13.272 "reset": true, 00:04:13.272 "compare": false, 00:04:13.272 "compare_and_write": false, 00:04:13.272 "abort": true, 00:04:13.272 "nvme_admin": false, 00:04:13.272 "nvme_io": false 00:04:13.272 }, 00:04:13.272 "memory_domains": [ 00:04:13.272 { 00:04:13.272 "dma_device_id": "system", 00:04:13.272 "dma_device_type": 1 00:04:13.272 }, 00:04:13.272 { 00:04:13.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:13.272 "dma_device_type": 2 00:04:13.272 } 00:04:13.272 ], 00:04:13.272 "driver_specific": { 00:04:13.272 "passthru": { 00:04:13.272 "name": "Passthru0", 00:04:13.272 "base_bdev_name": "Malloc0" 00:04:13.272 } 00:04:13.272 } 00:04:13.272 } 00:04:13.272 ]' 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@587 
-- # [[ 0 == 0 ]] 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:04:13.272 04:05:01 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:04:13.272 00:04:13.272 real 0m0.231s 00:04:13.272 user 0m0.151s 00:04:13.272 sys 0m0.023s 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:13.272 04:05:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.272 ************************************ 00:04:13.272 END TEST rpc_integrity 00:04:13.272 ************************************ 00:04:13.528 04:05:01 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:04:13.528 04:05:01 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:13.529 04:05:01 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:13.529 04:05:01 rpc -- common/autotest_common.sh@10 -- # set +x 00:04:13.529 ************************************ 00:04:13.529 START TEST rpc_plugins 00:04:13.529 ************************************ 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@1121 -- # rpc_plugins 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:04:13.529 { 00:04:13.529 "name": "Malloc1", 00:04:13.529 "aliases": [ 00:04:13.529 "a828a0d4-1219-42e1-8c98-325ebf147141" 00:04:13.529 ], 00:04:13.529 "product_name": "Malloc disk", 00:04:13.529 "block_size": 4096, 00:04:13.529 "num_blocks": 256, 00:04:13.529 "uuid": "a828a0d4-1219-42e1-8c98-325ebf147141", 00:04:13.529 "assigned_rate_limits": { 00:04:13.529 "rw_ios_per_sec": 0, 00:04:13.529 "rw_mbytes_per_sec": 0, 00:04:13.529 "r_mbytes_per_sec": 0, 00:04:13.529 "w_mbytes_per_sec": 0 00:04:13.529 }, 00:04:13.529 "claimed": false, 00:04:13.529 "zoned": false, 00:04:13.529 "supported_io_types": { 00:04:13.529 "read": true, 00:04:13.529 "write": true, 00:04:13.529 "unmap": true, 00:04:13.529 "write_zeroes": true, 00:04:13.529 "flush": true, 00:04:13.529 "reset": true, 00:04:13.529 
"compare": false, 00:04:13.529 "compare_and_write": false, 00:04:13.529 "abort": true, 00:04:13.529 "nvme_admin": false, 00:04:13.529 "nvme_io": false 00:04:13.529 }, 00:04:13.529 "memory_domains": [ 00:04:13.529 { 00:04:13.529 "dma_device_id": "system", 00:04:13.529 "dma_device_type": 1 00:04:13.529 }, 00:04:13.529 { 00:04:13.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:13.529 "dma_device_type": 2 00:04:13.529 } 00:04:13.529 ], 00:04:13.529 "driver_specific": {} 00:04:13.529 } 00:04:13.529 ]' 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:04:13.529 04:05:01 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:04:13.529 00:04:13.529 real 0m0.112s 00:04:13.529 user 0m0.071s 00:04:13.529 sys 0m0.013s 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:13.529 04:05:01 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:04:13.529 ************************************ 00:04:13.529 END TEST rpc_plugins 00:04:13.529 ************************************ 00:04:13.529 04:05:01 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:04:13.529 04:05:01 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:13.529 04:05:01 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:13.529 04:05:01 rpc -- common/autotest_common.sh@10 -- # set +x 00:04:13.529 ************************************ 00:04:13.529 START TEST rpc_trace_cmd_test 00:04:13.529 ************************************ 00:04:13.529 04:05:01 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1121 -- # rpc_trace_cmd_test 00:04:13.529 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:04:13.529 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:04:13.529 04:05:01 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.529 04:05:01 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:04:13.529 04:05:01 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.529 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:04:13.529 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid3788641", 00:04:13.529 "tpoint_group_mask": "0x8", 00:04:13.529 "iscsi_conn": { 00:04:13.529 "mask": "0x2", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "scsi": { 00:04:13.529 "mask": "0x4", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "bdev": { 00:04:13.529 "mask": "0x8", 00:04:13.529 "tpoint_mask": "0xffffffffffffffff" 00:04:13.529 }, 00:04:13.529 "nvmf_rdma": { 
00:04:13.529 "mask": "0x10", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "nvmf_tcp": { 00:04:13.529 "mask": "0x20", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "ftl": { 00:04:13.529 "mask": "0x40", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "blobfs": { 00:04:13.529 "mask": "0x80", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "dsa": { 00:04:13.529 "mask": "0x200", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "thread": { 00:04:13.529 "mask": "0x400", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "nvme_pcie": { 00:04:13.529 "mask": "0x800", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "iaa": { 00:04:13.529 "mask": "0x1000", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "nvme_tcp": { 00:04:13.529 "mask": "0x2000", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "bdev_nvme": { 00:04:13.529 "mask": "0x4000", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 }, 00:04:13.529 "sock": { 00:04:13.529 "mask": "0x8000", 00:04:13.529 "tpoint_mask": "0x0" 00:04:13.529 } 00:04:13.529 }' 00:04:13.529 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:04:13.529 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:04:13.529 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:04:13.785 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:04:13.785 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:04:13.785 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:04:13.785 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:04:13.785 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:04:13.785 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:04:13.785 04:05:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:04:13.785 00:04:13.785 real 0m0.197s 00:04:13.785 user 0m0.172s 00:04:13.785 sys 0m0.015s 00:04:13.785 04:05:01 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:13.785 04:05:01 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:04:13.785 ************************************ 00:04:13.785 END TEST rpc_trace_cmd_test 00:04:13.785 ************************************ 00:04:13.785 04:05:01 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:04:13.785 04:05:01 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:04:13.785 04:05:01 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:04:13.785 04:05:01 rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:13.785 04:05:01 rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:13.785 04:05:01 rpc -- common/autotest_common.sh@10 -- # set +x 00:04:13.785 ************************************ 00:04:13.785 START TEST rpc_daemon_integrity 00:04:13.785 ************************************ 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1121 -- # rpc_integrity 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.785 04:05:01 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:04:13.785 { 00:04:13.785 "name": "Malloc2", 00:04:13.785 "aliases": [ 00:04:13.785 "0ebae000-63fd-4aa2-83c0-b4ae2029d0c4" 00:04:13.785 ], 00:04:13.785 "product_name": "Malloc disk", 00:04:13.785 "block_size": 512, 00:04:13.785 "num_blocks": 16384, 00:04:13.785 "uuid": "0ebae000-63fd-4aa2-83c0-b4ae2029d0c4", 00:04:13.785 "assigned_rate_limits": { 00:04:13.785 "rw_ios_per_sec": 0, 00:04:13.785 "rw_mbytes_per_sec": 0, 00:04:13.785 "r_mbytes_per_sec": 0, 00:04:13.785 "w_mbytes_per_sec": 0 00:04:13.785 }, 00:04:13.785 "claimed": false, 00:04:13.785 "zoned": false, 00:04:13.785 "supported_io_types": { 00:04:13.785 "read": true, 00:04:13.785 "write": true, 00:04:13.785 "unmap": true, 00:04:13.785 "write_zeroes": true, 00:04:13.785 "flush": true, 00:04:13.785 "reset": true, 00:04:13.785 "compare": false, 00:04:13.785 "compare_and_write": false, 00:04:13.785 "abort": true, 00:04:13.785 "nvme_admin": false, 00:04:13.785 "nvme_io": false 00:04:13.785 }, 00:04:13.785 "memory_domains": [ 00:04:13.785 { 00:04:13.785 "dma_device_id": "system", 00:04:13.785 "dma_device_type": 1 00:04:13.785 }, 00:04:13.785 { 00:04:13.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:13.785 "dma_device_type": 2 00:04:13.785 } 00:04:13.785 ], 00:04:13.785 "driver_specific": {} 00:04:13.785 } 00:04:13.785 ]' 00:04:13.785 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:14.042 [2024-05-15 04:05:01.835606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:04:14.042 [2024-05-15 04:05:01.835649] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:14.042 [2024-05-15 04:05:01.835673] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17c2fa0 00:04:14.042 [2024-05-15 04:05:01.835704] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:14.042 [2024-05-15 04:05:01.837050] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:14.042 [2024-05-15 
04:05:01.837076] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:04:14.042 Passthru0 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:04:14.042 { 00:04:14.042 "name": "Malloc2", 00:04:14.042 "aliases": [ 00:04:14.042 "0ebae000-63fd-4aa2-83c0-b4ae2029d0c4" 00:04:14.042 ], 00:04:14.042 "product_name": "Malloc disk", 00:04:14.042 "block_size": 512, 00:04:14.042 "num_blocks": 16384, 00:04:14.042 "uuid": "0ebae000-63fd-4aa2-83c0-b4ae2029d0c4", 00:04:14.042 "assigned_rate_limits": { 00:04:14.042 "rw_ios_per_sec": 0, 00:04:14.042 "rw_mbytes_per_sec": 0, 00:04:14.042 "r_mbytes_per_sec": 0, 00:04:14.042 "w_mbytes_per_sec": 0 00:04:14.042 }, 00:04:14.042 "claimed": true, 00:04:14.042 "claim_type": "exclusive_write", 00:04:14.042 "zoned": false, 00:04:14.042 "supported_io_types": { 00:04:14.042 "read": true, 00:04:14.042 "write": true, 00:04:14.042 "unmap": true, 00:04:14.042 "write_zeroes": true, 00:04:14.042 "flush": true, 00:04:14.042 "reset": true, 00:04:14.042 "compare": false, 00:04:14.042 "compare_and_write": false, 00:04:14.042 "abort": true, 00:04:14.042 "nvme_admin": false, 00:04:14.042 "nvme_io": false 00:04:14.042 }, 00:04:14.042 "memory_domains": [ 00:04:14.042 { 00:04:14.042 "dma_device_id": "system", 00:04:14.042 "dma_device_type": 1 00:04:14.042 }, 00:04:14.042 { 00:04:14.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:14.042 "dma_device_type": 2 00:04:14.042 } 00:04:14.042 ], 00:04:14.042 "driver_specific": {} 00:04:14.042 }, 00:04:14.042 { 00:04:14.042 "name": "Passthru0", 00:04:14.042 "aliases": [ 00:04:14.042 "3be23c68-03b3-517f-8d0d-f42a04335ada" 00:04:14.042 ], 00:04:14.042 "product_name": "passthru", 00:04:14.042 "block_size": 512, 00:04:14.042 "num_blocks": 16384, 00:04:14.042 "uuid": "3be23c68-03b3-517f-8d0d-f42a04335ada", 00:04:14.042 "assigned_rate_limits": { 00:04:14.042 "rw_ios_per_sec": 0, 00:04:14.042 "rw_mbytes_per_sec": 0, 00:04:14.042 "r_mbytes_per_sec": 0, 00:04:14.042 "w_mbytes_per_sec": 0 00:04:14.042 }, 00:04:14.042 "claimed": false, 00:04:14.042 "zoned": false, 00:04:14.042 "supported_io_types": { 00:04:14.042 "read": true, 00:04:14.042 "write": true, 00:04:14.042 "unmap": true, 00:04:14.042 "write_zeroes": true, 00:04:14.042 "flush": true, 00:04:14.042 "reset": true, 00:04:14.042 "compare": false, 00:04:14.042 "compare_and_write": false, 00:04:14.042 "abort": true, 00:04:14.042 "nvme_admin": false, 00:04:14.042 "nvme_io": false 00:04:14.042 }, 00:04:14.042 "memory_domains": [ 00:04:14.042 { 00:04:14.042 "dma_device_id": "system", 00:04:14.042 "dma_device_type": 1 00:04:14.042 }, 00:04:14.042 { 00:04:14.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:14.042 "dma_device_type": 2 00:04:14.042 } 00:04:14.042 ], 00:04:14.042 "driver_specific": { 00:04:14.042 "passthru": { 00:04:14.042 "name": "Passthru0", 00:04:14.042 "base_bdev_name": "Malloc2" 00:04:14.042 } 00:04:14.042 } 00:04:14.042 } 00:04:14.042 ]' 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:04:14.042 04:05:01 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:04:14.042 00:04:14.042 real 0m0.228s 00:04:14.042 user 0m0.153s 00:04:14.042 sys 0m0.019s 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:14.042 04:05:01 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:04:14.042 ************************************ 00:04:14.042 END TEST rpc_daemon_integrity 00:04:14.042 ************************************ 00:04:14.042 04:05:01 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:04:14.042 04:05:01 rpc -- rpc/rpc.sh@84 -- # killprocess 3788641 00:04:14.042 04:05:01 rpc -- common/autotest_common.sh@946 -- # '[' -z 3788641 ']' 00:04:14.043 04:05:01 rpc -- common/autotest_common.sh@950 -- # kill -0 3788641 00:04:14.043 04:05:01 rpc -- common/autotest_common.sh@951 -- # uname 00:04:14.043 04:05:01 rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:04:14.043 04:05:01 rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3788641 00:04:14.043 04:05:02 rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:04:14.043 04:05:02 rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:04:14.043 04:05:02 rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3788641' 00:04:14.043 killing process with pid 3788641 00:04:14.043 04:05:02 rpc -- common/autotest_common.sh@965 -- # kill 3788641 00:04:14.043 04:05:02 rpc -- common/autotest_common.sh@970 -- # wait 3788641 00:04:14.606 00:04:14.606 real 0m2.481s 00:04:14.606 user 0m3.123s 00:04:14.606 sys 0m0.643s 00:04:14.606 04:05:02 rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:14.606 04:05:02 rpc -- common/autotest_common.sh@10 -- # set +x 00:04:14.606 ************************************ 00:04:14.606 END TEST rpc 00:04:14.606 ************************************ 00:04:14.606 04:05:02 -- spdk/autotest.sh@166 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:04:14.606 04:05:02 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:14.606 04:05:02 -- 
common/autotest_common.sh@1103 -- # xtrace_disable 00:04:14.606 04:05:02 -- common/autotest_common.sh@10 -- # set +x 00:04:14.606 ************************************ 00:04:14.606 START TEST skip_rpc 00:04:14.606 ************************************ 00:04:14.606 04:05:02 skip_rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:04:14.606 * Looking for test storage... 00:04:14.607 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:04:14.607 04:05:02 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:04:14.607 04:05:02 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:04:14.607 04:05:02 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:04:14.607 04:05:02 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:14.607 04:05:02 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:14.607 04:05:02 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:14.607 ************************************ 00:04:14.607 START TEST skip_rpc 00:04:14.607 ************************************ 00:04:14.607 04:05:02 skip_rpc.skip_rpc -- common/autotest_common.sh@1121 -- # test_skip_rpc 00:04:14.607 04:05:02 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=3789084 00:04:14.607 04:05:02 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:04:14.607 04:05:02 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:04:14.607 04:05:02 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:04:14.864 [2024-05-15 04:05:02.667280] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:04:14.864 [2024-05-15 04:05:02.667366] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3789084 ] 00:04:14.864 [2024-05-15 04:05:02.752619] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:14.864 [2024-05-15 04:05:02.873040] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 3789084 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@946 -- # '[' -z 3789084 ']' 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # kill -0 3789084 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@951 -- # uname 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3789084 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3789084' 00:04:20.125 killing process with pid 3789084 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@965 -- # kill 3789084 00:04:20.125 04:05:07 skip_rpc.skip_rpc -- common/autotest_common.sh@970 -- # wait 3789084 00:04:20.125 00:04:20.125 real 0m5.510s 00:04:20.125 user 0m5.166s 00:04:20.125 sys 0m0.339s 00:04:20.125 04:05:08 skip_rpc.skip_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:20.125 04:05:08 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:20.125 ************************************ 00:04:20.125 END TEST skip_rpc 00:04:20.125 ************************************ 00:04:20.125 
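Note: the skip_rpc case that just finished boils down to starting the target without its JSON-RPC server and proving that an RPC call then fails (the NOT rpc_cmd spdk_get_version block above). A rough standalone equivalent, using the same binary, flags, and default socket as the trace (the real logic lives in test/rpc/skip_rpc.sh):

    # Start the target with the RPC server disabled, as the test does
    ./build/bin/spdk_tgt --no-rpc-server -m 0x1 &
    tgt_pid=$!
    sleep 5
    # Nothing is listening on /var/tmp/spdk.sock, so this must fail;
    # the test treats a zero exit code here as a failure
    if ./scripts/rpc.py spdk_get_version; then
        echo "unexpected: RPC answered without an RPC server" >&2
    fi
    kill -9 "$tgt_pid"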
04:05:08 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:04:20.125 04:05:08 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:20.125 04:05:08 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:20.125 04:05:08 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:20.384 ************************************ 00:04:20.384 START TEST skip_rpc_with_json 00:04:20.384 ************************************ 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1121 -- # test_skip_rpc_with_json 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=3789777 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 3789777 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@827 -- # '[' -z 3789777 ']' 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@832 -- # local max_retries=100 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:20.384 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # xtrace_disable 00:04:20.384 04:05:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:04:20.384 [2024-05-15 04:05:08.221698] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
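Note: the target coming up here belongs to skip_rpc_with_json, which starts with RPC enabled, provokes the nvmf_get_transports error shown below, creates the TCP transport, saves the live configuration, then restarts the target from that file and checks the transport comes back. A rough manual outline, with paths matching the CONFIG_PATH and LOG_PATH set earlier in skip_rpc.sh:

    # Against the running target: create something worth persisting, then dump the config
    ./scripts/rpc.py nvmf_create_transport -t tcp
    ./scripts/rpc.py save_config > test/rpc/config.json
    # Stop the target, then restart it purely from the saved JSON
    ./build/bin/spdk_tgt --no-rpc-server -m 0x1 --json test/rpc/config.json
    # The harness captures that second run's output in log.txt and checks the transport came back
    grep -q 'TCP Transport Init' test/rpc/log.txt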
00:04:20.384 [2024-05-15 04:05:08.221776] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3789777 ] 00:04:20.384 [2024-05-15 04:05:08.303890] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:20.643 [2024-05-15 04:05:08.419782] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:21.209 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:04:21.209 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # return 0 00:04:21.209 04:05:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:04:21.209 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:21.209 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:04:21.209 [2024-05-15 04:05:09.139813] nvmf_rpc.c:2547:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:04:21.210 request: 00:04:21.210 { 00:04:21.210 "trtype": "tcp", 00:04:21.210 "method": "nvmf_get_transports", 00:04:21.210 "req_id": 1 00:04:21.210 } 00:04:21.210 Got JSON-RPC error response 00:04:21.210 response: 00:04:21.210 { 00:04:21.210 "code": -19, 00:04:21.210 "message": "No such device" 00:04:21.210 } 00:04:21.210 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:04:21.210 04:05:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:04:21.210 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:21.210 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:04:21.210 [2024-05-15 04:05:09.147942] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:04:21.210 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:21.210 04:05:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:04:21.210 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:04:21.210 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:04:21.507 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:04:21.507 04:05:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:04:21.507 { 00:04:21.507 "subsystems": [ 00:04:21.507 { 00:04:21.507 "subsystem": "keyring", 00:04:21.507 "config": [] 00:04:21.507 }, 00:04:21.507 { 00:04:21.507 "subsystem": "iobuf", 00:04:21.507 "config": [ 00:04:21.507 { 00:04:21.507 "method": "iobuf_set_options", 00:04:21.507 "params": { 00:04:21.507 "small_pool_count": 8192, 00:04:21.507 "large_pool_count": 1024, 00:04:21.507 "small_bufsize": 8192, 00:04:21.507 "large_bufsize": 135168 00:04:21.507 } 00:04:21.507 } 00:04:21.507 ] 00:04:21.507 }, 00:04:21.507 { 00:04:21.507 "subsystem": "sock", 00:04:21.507 "config": [ 00:04:21.507 { 00:04:21.507 "method": "sock_impl_set_options", 00:04:21.507 "params": { 00:04:21.507 "impl_name": "posix", 00:04:21.507 "recv_buf_size": 2097152, 00:04:21.507 "send_buf_size": 2097152, 00:04:21.507 "enable_recv_pipe": true, 00:04:21.507 "enable_quickack": false, 00:04:21.507 "enable_placement_id": 0, 00:04:21.507 "enable_zerocopy_send_server": true, 
00:04:21.507 "enable_zerocopy_send_client": false, 00:04:21.507 "zerocopy_threshold": 0, 00:04:21.507 "tls_version": 0, 00:04:21.507 "enable_ktls": false 00:04:21.507 } 00:04:21.507 }, 00:04:21.507 { 00:04:21.507 "method": "sock_impl_set_options", 00:04:21.507 "params": { 00:04:21.507 "impl_name": "ssl", 00:04:21.507 "recv_buf_size": 4096, 00:04:21.507 "send_buf_size": 4096, 00:04:21.507 "enable_recv_pipe": true, 00:04:21.507 "enable_quickack": false, 00:04:21.507 "enable_placement_id": 0, 00:04:21.507 "enable_zerocopy_send_server": true, 00:04:21.507 "enable_zerocopy_send_client": false, 00:04:21.507 "zerocopy_threshold": 0, 00:04:21.507 "tls_version": 0, 00:04:21.507 "enable_ktls": false 00:04:21.507 } 00:04:21.507 } 00:04:21.507 ] 00:04:21.507 }, 00:04:21.507 { 00:04:21.507 "subsystem": "vmd", 00:04:21.507 "config": [] 00:04:21.507 }, 00:04:21.507 { 00:04:21.507 "subsystem": "accel", 00:04:21.507 "config": [ 00:04:21.507 { 00:04:21.507 "method": "accel_set_options", 00:04:21.507 "params": { 00:04:21.507 "small_cache_size": 128, 00:04:21.507 "large_cache_size": 16, 00:04:21.507 "task_count": 2048, 00:04:21.507 "sequence_count": 2048, 00:04:21.507 "buf_count": 2048 00:04:21.507 } 00:04:21.507 } 00:04:21.507 ] 00:04:21.507 }, 00:04:21.507 { 00:04:21.507 "subsystem": "bdev", 00:04:21.507 "config": [ 00:04:21.507 { 00:04:21.507 "method": "bdev_set_options", 00:04:21.507 "params": { 00:04:21.507 "bdev_io_pool_size": 65535, 00:04:21.507 "bdev_io_cache_size": 256, 00:04:21.507 "bdev_auto_examine": true, 00:04:21.507 "iobuf_small_cache_size": 128, 00:04:21.507 "iobuf_large_cache_size": 16 00:04:21.507 } 00:04:21.507 }, 00:04:21.507 { 00:04:21.507 "method": "bdev_raid_set_options", 00:04:21.507 "params": { 00:04:21.507 "process_window_size_kb": 1024 00:04:21.507 } 00:04:21.507 }, 00:04:21.507 { 00:04:21.507 "method": "bdev_iscsi_set_options", 00:04:21.507 "params": { 00:04:21.507 "timeout_sec": 30 00:04:21.507 } 00:04:21.507 }, 00:04:21.507 { 00:04:21.507 "method": "bdev_nvme_set_options", 00:04:21.507 "params": { 00:04:21.507 "action_on_timeout": "none", 00:04:21.507 "timeout_us": 0, 00:04:21.507 "timeout_admin_us": 0, 00:04:21.507 "keep_alive_timeout_ms": 10000, 00:04:21.507 "arbitration_burst": 0, 00:04:21.507 "low_priority_weight": 0, 00:04:21.507 "medium_priority_weight": 0, 00:04:21.507 "high_priority_weight": 0, 00:04:21.507 "nvme_adminq_poll_period_us": 10000, 00:04:21.507 "nvme_ioq_poll_period_us": 0, 00:04:21.507 "io_queue_requests": 0, 00:04:21.507 "delay_cmd_submit": true, 00:04:21.507 "transport_retry_count": 4, 00:04:21.507 "bdev_retry_count": 3, 00:04:21.507 "transport_ack_timeout": 0, 00:04:21.507 "ctrlr_loss_timeout_sec": 0, 00:04:21.507 "reconnect_delay_sec": 0, 00:04:21.507 "fast_io_fail_timeout_sec": 0, 00:04:21.507 "disable_auto_failback": false, 00:04:21.507 "generate_uuids": false, 00:04:21.507 "transport_tos": 0, 00:04:21.507 "nvme_error_stat": false, 00:04:21.507 "rdma_srq_size": 0, 00:04:21.507 "io_path_stat": false, 00:04:21.507 "allow_accel_sequence": false, 00:04:21.507 "rdma_max_cq_size": 0, 00:04:21.507 "rdma_cm_event_timeout_ms": 0, 00:04:21.507 "dhchap_digests": [ 00:04:21.507 "sha256", 00:04:21.507 "sha384", 00:04:21.507 "sha512" 00:04:21.507 ], 00:04:21.507 "dhchap_dhgroups": [ 00:04:21.507 "null", 00:04:21.507 "ffdhe2048", 00:04:21.507 "ffdhe3072", 00:04:21.507 "ffdhe4096", 00:04:21.507 "ffdhe6144", 00:04:21.507 "ffdhe8192" 00:04:21.507 ] 00:04:21.507 } 00:04:21.507 }, 00:04:21.507 { 00:04:21.507 "method": "bdev_nvme_set_hotplug", 00:04:21.507 "params": { 
00:04:21.507 "period_us": 100000, 00:04:21.507 "enable": false 00:04:21.507 } 00:04:21.507 }, 00:04:21.507 { 00:04:21.508 "method": "bdev_wait_for_examine" 00:04:21.508 } 00:04:21.508 ] 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "subsystem": "scsi", 00:04:21.508 "config": null 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "subsystem": "scheduler", 00:04:21.508 "config": [ 00:04:21.508 { 00:04:21.508 "method": "framework_set_scheduler", 00:04:21.508 "params": { 00:04:21.508 "name": "static" 00:04:21.508 } 00:04:21.508 } 00:04:21.508 ] 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "subsystem": "vhost_scsi", 00:04:21.508 "config": [] 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "subsystem": "vhost_blk", 00:04:21.508 "config": [] 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "subsystem": "ublk", 00:04:21.508 "config": [] 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "subsystem": "nbd", 00:04:21.508 "config": [] 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "subsystem": "nvmf", 00:04:21.508 "config": [ 00:04:21.508 { 00:04:21.508 "method": "nvmf_set_config", 00:04:21.508 "params": { 00:04:21.508 "discovery_filter": "match_any", 00:04:21.508 "admin_cmd_passthru": { 00:04:21.508 "identify_ctrlr": false 00:04:21.508 } 00:04:21.508 } 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "method": "nvmf_set_max_subsystems", 00:04:21.508 "params": { 00:04:21.508 "max_subsystems": 1024 00:04:21.508 } 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "method": "nvmf_set_crdt", 00:04:21.508 "params": { 00:04:21.508 "crdt1": 0, 00:04:21.508 "crdt2": 0, 00:04:21.508 "crdt3": 0 00:04:21.508 } 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "method": "nvmf_create_transport", 00:04:21.508 "params": { 00:04:21.508 "trtype": "TCP", 00:04:21.508 "max_queue_depth": 128, 00:04:21.508 "max_io_qpairs_per_ctrlr": 127, 00:04:21.508 "in_capsule_data_size": 4096, 00:04:21.508 "max_io_size": 131072, 00:04:21.508 "io_unit_size": 131072, 00:04:21.508 "max_aq_depth": 128, 00:04:21.508 "num_shared_buffers": 511, 00:04:21.508 "buf_cache_size": 4294967295, 00:04:21.508 "dif_insert_or_strip": false, 00:04:21.508 "zcopy": false, 00:04:21.508 "c2h_success": true, 00:04:21.508 "sock_priority": 0, 00:04:21.508 "abort_timeout_sec": 1, 00:04:21.508 "ack_timeout": 0, 00:04:21.508 "data_wr_pool_size": 0 00:04:21.508 } 00:04:21.508 } 00:04:21.508 ] 00:04:21.508 }, 00:04:21.508 { 00:04:21.508 "subsystem": "iscsi", 00:04:21.508 "config": [ 00:04:21.508 { 00:04:21.508 "method": "iscsi_set_options", 00:04:21.508 "params": { 00:04:21.508 "node_base": "iqn.2016-06.io.spdk", 00:04:21.508 "max_sessions": 128, 00:04:21.508 "max_connections_per_session": 2, 00:04:21.508 "max_queue_depth": 64, 00:04:21.508 "default_time2wait": 2, 00:04:21.508 "default_time2retain": 20, 00:04:21.508 "first_burst_length": 8192, 00:04:21.508 "immediate_data": true, 00:04:21.508 "allow_duplicated_isid": false, 00:04:21.508 "error_recovery_level": 0, 00:04:21.508 "nop_timeout": 60, 00:04:21.508 "nop_in_interval": 30, 00:04:21.508 "disable_chap": false, 00:04:21.508 "require_chap": false, 00:04:21.508 "mutual_chap": false, 00:04:21.508 "chap_group": 0, 00:04:21.508 "max_large_datain_per_connection": 64, 00:04:21.508 "max_r2t_per_connection": 4, 00:04:21.508 "pdu_pool_size": 36864, 00:04:21.508 "immediate_data_pool_size": 16384, 00:04:21.508 "data_out_pool_size": 2048 00:04:21.508 } 00:04:21.508 } 00:04:21.508 ] 00:04:21.508 } 00:04:21.508 ] 00:04:21.508 } 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:04:21.508 04:05:09 
skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 3789777 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@946 -- # '[' -z 3789777 ']' 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # kill -0 3789777 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # uname 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3789777 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3789777' 00:04:21.508 killing process with pid 3789777 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@965 -- # kill 3789777 00:04:21.508 04:05:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@970 -- # wait 3789777 00:04:21.791 04:05:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=3790040 00:04:21.791 04:05:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:04:21.791 04:05:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 3790040 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@946 -- # '[' -z 3790040 ']' 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # kill -0 3790040 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # uname 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3790040 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3790040' 00:04:27.051 killing process with pid 3790040 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@965 -- # kill 3790040 00:04:27.051 04:05:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@970 -- # wait 3790040 00:04:27.308 04:05:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:04:27.308 04:05:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:04:27.308 00:04:27.308 real 0m7.148s 00:04:27.308 user 0m6.874s 00:04:27.308 sys 0m0.758s 00:04:27.308 04:05:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:27.308 04:05:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:04:27.308 ************************************ 00:04:27.308 END TEST 
skip_rpc_with_json 00:04:27.308 ************************************ 00:04:27.566 04:05:15 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:04:27.566 04:05:15 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:27.566 04:05:15 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:27.566 04:05:15 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:27.566 ************************************ 00:04:27.566 START TEST skip_rpc_with_delay 00:04:27.566 ************************************ 00:04:27.566 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1121 -- # test_skip_rpc_with_delay 00:04:27.566 04:05:15 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:04:27.566 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:04:27.566 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:04:27.567 [2024-05-15 04:05:15.424491] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
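For reference: the skip_rpc_with_delay case above asserts that spdk_tgt refuses --wait-for-rpc when --no-rpc-server is also given, so the NOT wrapper expects a non-zero exit. A minimal hand-run sketch of the same check, assuming the workspace path shown in the trace:
  # expected to fail: there is no RPC server for --wait-for-rpc to wait on
  if ! /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
      echo 'spdk_tgt rejected --wait-for-rpc without an RPC server, as expected'
  fi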
00:04:27.567 [2024-05-15 04:05:15.424607] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:04:27.567 00:04:27.567 real 0m0.077s 00:04:27.567 user 0m0.055s 00:04:27.567 sys 0m0.021s 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:27.567 04:05:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:04:27.567 ************************************ 00:04:27.567 END TEST skip_rpc_with_delay 00:04:27.567 ************************************ 00:04:27.567 04:05:15 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:04:27.567 04:05:15 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:04:27.567 04:05:15 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:04:27.567 04:05:15 skip_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:27.567 04:05:15 skip_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:27.567 04:05:15 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:27.567 ************************************ 00:04:27.567 START TEST exit_on_failed_rpc_init 00:04:27.567 ************************************ 00:04:27.567 04:05:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1121 -- # test_exit_on_failed_rpc_init 00:04:27.567 04:05:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=3790759 00:04:27.567 04:05:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:04:27.567 04:05:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 3790759 00:04:27.567 04:05:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@827 -- # '[' -z 3790759 ']' 00:04:27.567 04:05:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:27.567 04:05:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@832 -- # local max_retries=100 00:04:27.567 04:05:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:27.567 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:27.567 04:05:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # xtrace_disable 00:04:27.567 04:05:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:04:27.567 [2024-05-15 04:05:15.554757] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:04:27.567 [2024-05-15 04:05:15.554861] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3790759 ] 00:04:27.825 [2024-05-15 04:05:15.633497] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:27.825 [2024-05-15 04:05:15.740285] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # return 0 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:04:28.760 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:04:28.760 [2024-05-15 04:05:16.548337] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:04:28.760 [2024-05-15 04:05:16.548424] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3790825 ] 00:04:28.760 [2024-05-15 04:05:16.629223] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:28.760 [2024-05-15 04:05:16.752575] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:04:28.760 [2024-05-15 04:05:16.752720] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
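For reference: the exit_on_failed_rpc_init case drives exactly the collision reported above. The first target (pid 3790759, -m 0x1) owns the default RPC socket /var/tmp/spdk.sock, so a second instance started with -m 0x2 against the same default socket must fail RPC initialization and exit non-zero. A rough sketch of that scenario, assuming the same workspace path and with a crude sleep standing in for the harness's waitforlisten helper:
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 &
  first_pid=$!                                   # first instance claims /var/tmp/spdk.sock
  sleep 2                                        # crude wait; the test uses waitforlisten
  if ! /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2; then
      echo 'second spdk_tgt exited on failed RPC init, as expected'
  fi
  kill $first_pid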
00:04:28.760 [2024-05-15 04:05:16.752739] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:04:28.760 [2024-05-15 04:05:16.752751] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 3790759 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@946 -- # '[' -z 3790759 ']' 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # kill -0 3790759 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@951 -- # uname 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3790759 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3790759' 00:04:29.018 killing process with pid 3790759 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@965 -- # kill 3790759 00:04:29.018 04:05:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@970 -- # wait 3790759 00:04:29.584 00:04:29.584 real 0m1.858s 00:04:29.584 user 0m2.221s 00:04:29.584 sys 0m0.510s 00:04:29.584 04:05:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:29.584 04:05:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:04:29.584 ************************************ 00:04:29.584 END TEST exit_on_failed_rpc_init 00:04:29.584 ************************************ 00:04:29.584 04:05:17 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:04:29.584 00:04:29.584 real 0m14.854s 00:04:29.584 user 0m14.414s 00:04:29.584 sys 0m1.798s 00:04:29.584 04:05:17 skip_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:29.584 04:05:17 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:29.584 ************************************ 00:04:29.584 END TEST skip_rpc 00:04:29.584 ************************************ 00:04:29.584 04:05:17 -- spdk/autotest.sh@167 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:04:29.584 04:05:17 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:29.584 04:05:17 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:29.584 04:05:17 -- 
common/autotest_common.sh@10 -- # set +x 00:04:29.584 ************************************ 00:04:29.584 START TEST rpc_client 00:04:29.584 ************************************ 00:04:29.584 04:05:17 rpc_client -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:04:29.584 * Looking for test storage... 00:04:29.584 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:04:29.584 04:05:17 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:04:29.584 OK 00:04:29.584 04:05:17 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:04:29.584 00:04:29.584 real 0m0.076s 00:04:29.584 user 0m0.032s 00:04:29.584 sys 0m0.048s 00:04:29.584 04:05:17 rpc_client -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:29.584 04:05:17 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:04:29.584 ************************************ 00:04:29.584 END TEST rpc_client 00:04:29.584 ************************************ 00:04:29.584 04:05:17 -- spdk/autotest.sh@168 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:04:29.584 04:05:17 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:29.584 04:05:17 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:29.584 04:05:17 -- common/autotest_common.sh@10 -- # set +x 00:04:29.585 ************************************ 00:04:29.585 START TEST json_config 00:04:29.585 ************************************ 00:04:29.585 04:05:17 json_config -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:04:29.585 04:05:17 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@7 -- # uname -s 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:29.585 04:05:17 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8b464f06-2980-e311-ba20-001e67a94acd 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=8b464f06-2980-e311-ba20-001e67a94acd 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:04:29.843 04:05:17 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:29.843 04:05:17 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:29.843 04:05:17 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:29.843 04:05:17 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:29.843 04:05:17 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:29.843 04:05:17 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:29.843 04:05:17 json_config -- paths/export.sh@5 -- # export PATH 00:04:29.843 04:05:17 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@47 -- # : 0 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:04:29.843 04:05:17 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:04:29.843 04:05:17 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST 
+ SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:04:29.844 INFO: JSON configuration test init 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:04:29.844 04:05:17 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:04:29.844 04:05:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:04:29.844 04:05:17 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:04:29.844 04:05:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:29.844 04:05:17 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:04:29.844 04:05:17 json_config -- json_config/common.sh@9 -- # local app=target 00:04:29.844 04:05:17 json_config -- json_config/common.sh@10 -- # shift 00:04:29.844 04:05:17 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:04:29.844 04:05:17 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:04:29.844 04:05:17 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:04:29.844 04:05:17 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:04:29.844 04:05:17 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:04:29.844 04:05:17 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=3791066 00:04:29.844 04:05:17 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:04:29.844 Waiting for target to run... 
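For reference: the json_config harness variables traced above (app_params['target']='-m 0x1 -s 1024', app_socket['target']='/var/tmp/spdk_tgt.sock', configs_path['target']=.../spdk_tgt_config.json) amount to launching the target on a dedicated RPC socket and then driving it with rpc.py before the framework starts. A hand-run sketch using only values visible in this trace, matching the launch and create_accel_config steps that follow; SPDK is shorthand for the workspace path, and the config path only comes into play at the later relaunch step:
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $SPDK/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc &
  # once the socket is listening, accel is configured before framework start
  $SPDK/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module
  $SPDK/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev
  $SPDK/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev
  # the NVMe bdevs are then attached by piping gen_nvme.sh output into load_config
  $SPDK/scripts/gen_nvme.sh --json-with-subsystems | $SPDK/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config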
00:04:29.844 04:05:17 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:04:29.844 04:05:17 json_config -- json_config/common.sh@25 -- # waitforlisten 3791066 /var/tmp/spdk_tgt.sock 00:04:29.844 04:05:17 json_config -- common/autotest_common.sh@827 -- # '[' -z 3791066 ']' 00:04:29.844 04:05:17 json_config -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:04:29.844 04:05:17 json_config -- common/autotest_common.sh@832 -- # local max_retries=100 00:04:29.844 04:05:17 json_config -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:04:29.844 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:04:29.844 04:05:17 json_config -- common/autotest_common.sh@836 -- # xtrace_disable 00:04:29.844 04:05:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:29.844 [2024-05-15 04:05:17.674624] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:04:29.844 [2024-05-15 04:05:17.674725] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3791066 ] 00:04:30.102 [2024-05-15 04:05:18.048645] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:30.359 [2024-05-15 04:05:18.137307] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:30.617 04:05:18 json_config -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:04:30.617 04:05:18 json_config -- common/autotest_common.sh@860 -- # return 0 00:04:30.617 04:05:18 json_config -- json_config/common.sh@26 -- # echo '' 00:04:30.617 00:04:30.617 04:05:18 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:04:30.617 04:05:18 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:04:30.617 04:05:18 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:04:30.617 04:05:18 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:30.617 04:05:18 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:04:30.617 04:05:18 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:04:30.617 04:05:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:04:30.875 04:05:18 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:04:30.875 04:05:18 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:04:31.133 [2024-05-15 04:05:19.068132] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:04:31.133 04:05:19 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:04:31.133 04:05:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:04:31.390 [2024-05-15 04:05:19.320772] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: 
Operation decrypt will be assigned to module dpdk_cryptodev 00:04:31.390 04:05:19 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:04:31.390 04:05:19 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:31.390 04:05:19 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:31.390 04:05:19 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:04:31.390 04:05:19 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:04:31.390 04:05:19 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:04:31.648 [2024-05-15 04:05:19.605791] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:04:36.934 04:05:24 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:04:36.934 04:05:24 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:04:36.934 04:05:24 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:04:36.934 04:05:24 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:36.934 04:05:24 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:04:36.934 04:05:24 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:04:36.934 04:05:24 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:04:36.934 04:05:24 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:04:36.934 04:05:24 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:04:36.934 04:05:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@48 -- # local get_types 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:04:37.190 04:05:25 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:37.190 04:05:25 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@55 -- # return 0 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:04:37.190 04:05:25 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:04:37.190 04:05:25 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:04:37.190 
04:05:25 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:04:37.190 04:05:25 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:04:37.190 04:05:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:04:37.448 04:05:25 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:04:37.448 04:05:25 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:37.448 04:05:25 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:37.448 04:05:25 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:04:37.448 04:05:25 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:04:37.448 04:05:25 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:04:37.448 04:05:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:04:37.704 Nvme0n1p0 Nvme0n1p1 00:04:37.704 04:05:25 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:04:37.704 04:05:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:04:37.962 [2024-05-15 04:05:25.836794] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:04:37.962 [2024-05-15 04:05:25.836883] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:04:37.962 00:04:37.962 04:05:25 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:04:37.962 04:05:25 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:04:38.219 Malloc3 00:04:38.219 04:05:26 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:04:38.219 04:05:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:04:38.476 [2024-05-15 04:05:26.326180] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:04:38.476 [2024-05-15 04:05:26.326249] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:38.476 [2024-05-15 04:05:26.326280] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d1a1b0 00:04:38.476 [2024-05-15 04:05:26.326295] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:38.476 [2024-05-15 04:05:26.327921] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:38.476 [2024-05-15 04:05:26.327945] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:04:38.476 PTBdevFromMalloc3 00:04:38.476 04:05:26 json_config -- 
json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:04:38.476 04:05:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:04:38.734 Null0 00:04:38.734 04:05:26 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:04:38.734 04:05:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:04:38.991 Malloc0 00:04:38.991 04:05:26 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:04:38.991 04:05:26 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:04:39.249 Malloc1 00:04:39.249 04:05:27 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:04:39.249 04:05:27 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:04:39.506 102400+0 records in 00:04:39.506 102400+0 records out 00:04:39.506 104857600 bytes (105 MB, 100 MiB) copied, 0.179601 s, 584 MB/s 00:04:39.506 04:05:27 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:04:39.506 04:05:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:04:39.506 aio_disk 00:04:39.763 04:05:27 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:04:39.763 04:05:27 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:04:39.763 04:05:27 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:04:43.943 0a832283-0837-4e5a-9dd4-d61b71f08458 00:04:43.943 04:05:31 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:04:43.943 04:05:31 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:04:43.943 04:05:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:04:43.943 04:05:31 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:04:43.943 04:05:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:04:44.200 04:05:32 json_config -- json_config/json_config.sh@154 -- # tgt_rpc 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:04:44.200 04:05:32 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:04:44.457 04:05:32 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:04:44.457 04:05:32 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:04:44.715 04:05:32 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:04:44.715 04:05:32 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:04:44.715 04:05:32 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:04:44.973 MallocForCryptoBdev 00:04:44.973 04:05:32 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:04:44.973 04:05:32 json_config -- json_config/json_config.sh@159 -- # wc -l 00:04:44.973 04:05:32 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:04:44.973 04:05:32 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:04:44.973 04:05:32 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:04:44.973 04:05:32 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:04:45.231 [2024-05-15 04:05:33.127765] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:04:45.231 CryptoMallocBdev 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:af67635e-b82b-4edf-9686-81804eba4a08 bdev_register:87f1dcb4-d2b0-48ca-9ce7-97af4182bf2f bdev_register:4e2e972b-e2a4-49f9-957a-fe3cac829a3b bdev_register:0d67a88b-1330-4716-bfd8-dc1fbdd975f4 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 
bdev_register:Malloc1 bdev_register:aio_disk bdev_register:af67635e-b82b-4edf-9686-81804eba4a08 bdev_register:87f1dcb4-d2b0-48ca-9ce7-97af4182bf2f bdev_register:4e2e972b-e2a4-49f9-957a-fe3cac829a3b bdev_register:0d67a88b-1330-4716-bfd8-dc1fbdd975f4 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@71 -- # sort 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@72 -- # sort 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:04:45.231 04:05:33 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:04:45.231 04:05:33 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 
json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:af67635e-b82b-4edf-9686-81804eba4a08 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.489 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:87f1dcb4-d2b0-48ca-9ce7-97af4182bf2f 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:4e2e972b-e2a4-49f9-957a-fe3cac829a3b 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:0d67a88b-1330-4716-bfd8-dc1fbdd975f4 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:0d67a88b-1330-4716-bfd8-dc1fbdd975f4 bdev_register:4e2e972b-e2a4-49f9-957a-fe3cac829a3b bdev_register:87f1dcb4-d2b0-48ca-9ce7-97af4182bf2f bdev_register:af67635e-b82b-4edf-9686-81804eba4a08 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev 
bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\d\6\7\a\8\8\b\-\1\3\3\0\-\4\7\1\6\-\b\f\d\8\-\d\c\1\f\b\d\d\9\7\5\f\4\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\4\e\2\e\9\7\2\b\-\e\2\a\4\-\4\9\f\9\-\9\5\7\a\-\f\e\3\c\a\c\8\2\9\a\3\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\8\7\f\1\d\c\b\4\-\d\2\b\0\-\4\8\c\a\-\9\c\e\7\-\9\7\a\f\4\1\8\2\b\f\2\f\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\f\6\7\6\3\5\e\-\b\8\2\b\-\4\e\d\f\-\9\6\8\6\-\8\1\8\0\4\e\b\a\4\a\0\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@86 -- # cat 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:0d67a88b-1330-4716-bfd8-dc1fbdd975f4 bdev_register:4e2e972b-e2a4-49f9-957a-fe3cac829a3b bdev_register:87f1dcb4-d2b0-48ca-9ce7-97af4182bf2f bdev_register:af67635e-b82b-4edf-9686-81804eba4a08 bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:04:45.490 Expected events matched: 00:04:45.490 bdev_register:0d67a88b-1330-4716-bfd8-dc1fbdd975f4 00:04:45.490 bdev_register:4e2e972b-e2a4-49f9-957a-fe3cac829a3b 00:04:45.490 bdev_register:87f1dcb4-d2b0-48ca-9ce7-97af4182bf2f 00:04:45.490 bdev_register:af67635e-b82b-4edf-9686-81804eba4a08 00:04:45.490 bdev_register:aio_disk 00:04:45.490 bdev_register:CryptoMallocBdev 00:04:45.490 bdev_register:Malloc0 00:04:45.490 bdev_register:Malloc0p0 00:04:45.490 bdev_register:Malloc0p1 00:04:45.490 bdev_register:Malloc0p2 00:04:45.490 bdev_register:Malloc1 00:04:45.490 bdev_register:Malloc3 00:04:45.490 bdev_register:MallocForCryptoBdev 00:04:45.490 bdev_register:Null0 00:04:45.490 bdev_register:Nvme0n1 00:04:45.490 bdev_register:Nvme0n1p0 00:04:45.490 bdev_register:Nvme0n1p1 00:04:45.490 bdev_register:PTBdevFromMalloc3 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:04:45.490 04:05:33 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:45.490 04:05:33 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:04:45.490 04:05:33 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:45.490 04:05:33 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:04:45.490 04:05:33 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:04:45.490 04:05:33 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:04:45.748 MallocBdevForConfigChangeCheck 00:04:45.748 04:05:33 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:04:45.748 04:05:33 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:45.748 04:05:33 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:45.748 04:05:33 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:04:45.748 04:05:33 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:04:46.313 04:05:34 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:04:46.313 INFO: shutting down applications... 00:04:46.313 04:05:34 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:04:46.313 04:05:34 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:04:46.313 04:05:34 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:04:46.313 04:05:34 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:04:46.313 [2024-05-15 04:05:34.303389] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:04:49.602 Calling clear_iscsi_subsystem 00:04:49.602 Calling clear_nvmf_subsystem 00:04:49.602 Calling clear_nbd_subsystem 00:04:49.602 Calling clear_ublk_subsystem 00:04:49.602 Calling clear_vhost_blk_subsystem 00:04:49.602 Calling clear_vhost_scsi_subsystem 00:04:49.602 Calling clear_bdev_subsystem 00:04:49.602 04:05:36 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:04:49.602 04:05:36 json_config -- json_config/json_config.sh@343 -- # count=100 00:04:49.602 04:05:36 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:04:49.602 04:05:36 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:04:49.602 04:05:36 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:04:49.602 04:05:36 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:04:49.602 04:05:37 json_config -- json_config/json_config.sh@345 -- # break 00:04:49.602 04:05:37 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:04:49.602 04:05:37 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:04:49.602 04:05:37 json_config -- json_config/common.sh@31 -- # local app=target 00:04:49.602 04:05:37 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:04:49.603 04:05:37 json_config -- json_config/common.sh@35 -- # [[ -n 
3791066 ]] 00:04:49.603 04:05:37 json_config -- json_config/common.sh@38 -- # kill -SIGINT 3791066 00:04:49.603 04:05:37 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:04:49.603 04:05:37 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:04:49.603 04:05:37 json_config -- json_config/common.sh@41 -- # kill -0 3791066 00:04:49.603 04:05:37 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:04:49.861 04:05:37 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:04:49.861 04:05:37 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:04:49.861 04:05:37 json_config -- json_config/common.sh@41 -- # kill -0 3791066 00:04:49.861 04:05:37 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:04:49.861 04:05:37 json_config -- json_config/common.sh@43 -- # break 00:04:49.861 04:05:37 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:04:49.861 04:05:37 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:04:49.861 SPDK target shutdown done 00:04:49.861 04:05:37 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:04:49.861 INFO: relaunching applications... 00:04:49.861 04:05:37 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:04:49.861 04:05:37 json_config -- json_config/common.sh@9 -- # local app=target 00:04:49.861 04:05:37 json_config -- json_config/common.sh@10 -- # shift 00:04:49.861 04:05:37 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:04:49.861 04:05:37 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:04:49.861 04:05:37 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:04:49.861 04:05:37 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:04:49.861 04:05:37 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:04:49.861 04:05:37 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=3793574 00:04:49.861 04:05:37 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:04:49.861 04:05:37 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:04:49.861 Waiting for target to run... 00:04:49.861 04:05:37 json_config -- json_config/common.sh@25 -- # waitforlisten 3793574 /var/tmp/spdk_tgt.sock 00:04:49.861 04:05:37 json_config -- common/autotest_common.sh@827 -- # '[' -z 3793574 ']' 00:04:49.861 04:05:37 json_config -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:04:49.861 04:05:37 json_config -- common/autotest_common.sh@832 -- # local max_retries=100 00:04:49.862 04:05:37 json_config -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:04:49.862 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:04:49.862 04:05:37 json_config -- common/autotest_common.sh@836 -- # xtrace_disable 00:04:49.862 04:05:37 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:49.862 [2024-05-15 04:05:37.832696] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
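Note: the shutdown traced above reduces to sending SIGINT to the target and then polling with kill -0 until the process is gone, giving up after a fixed number of retries. A minimal standalone sketch of that pattern, assuming the variable pid holds the spdk_tgt process ID; the function name and the 30 x 0.5 s budget mirror the trace, but the helper itself is illustrative, not the exact code in json_config/common.sh:

    shutdown_target() {
        local pid=$1
        kill -SIGINT "$pid" || return 1        # ask the app to shut down cleanly
        for ((i = 0; i < 30; i++)); do         # roughly a 15 second grace period
            if ! kill -0 "$pid" 2>/dev/null; then
                echo 'SPDK target shutdown done'   # kill -0 failing means the PID no longer exists
                return 0
            fi
            sleep 0.5
        done
        return 1                               # still running after the grace period
    }

The same SIGINT-then-poll loop appears again later in this log when json_config_extra_key tears down its own target.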
00:04:49.862 [2024-05-15 04:05:37.832782] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3793574 ] 00:04:50.428 [2024-05-15 04:05:38.361578] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:50.685 [2024-05-15 04:05:38.468932] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:50.685 [2024-05-15 04:05:38.514983] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:04:50.685 [2024-05-15 04:05:38.523021] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:04:50.685 [2024-05-15 04:05:38.531039] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:04:50.685 [2024-05-15 04:05:38.611817] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:04:53.214 [2024-05-15 04:05:40.994172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:04:53.214 [2024-05-15 04:05:40.994250] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:04:53.214 [2024-05-15 04:05:40.994277] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:04:53.214 [2024-05-15 04:05:41.002192] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:04:53.214 [2024-05-15 04:05:41.002227] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:04:53.214 [2024-05-15 04:05:41.010201] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:04:53.214 [2024-05-15 04:05:41.010233] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:04:53.214 [2024-05-15 04:05:41.018235] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:04:53.214 [2024-05-15 04:05:41.018277] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:04:53.214 [2024-05-15 04:05:41.018295] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:04:56.568 [2024-05-15 04:05:43.897838] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:04:56.568 [2024-05-15 04:05:43.897893] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:56.568 [2024-05-15 04:05:43.897915] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc3cda0 00:04:56.568 [2024-05-15 04:05:43.897933] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:56.568 [2024-05-15 04:05:43.898195] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:56.568 [2024-05-15 04:05:43.898215] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:04:56.568 04:05:44 json_config -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:04:56.568 04:05:44 json_config -- common/autotest_common.sh@860 -- # return 0 00:04:56.568 04:05:44 json_config -- json_config/common.sh@26 -- # echo '' 00:04:56.568 00:04:56.568 04:05:44 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:04:56.568 04:05:44 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: 
Checking if target configuration is the same...' 00:04:56.568 INFO: Checking if target configuration is the same... 00:04:56.568 04:05:44 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:04:56.568 04:05:44 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:04:56.568 04:05:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:04:56.568 + '[' 2 -ne 2 ']' 00:04:56.568 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:04:56.568 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:04:56.568 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:56.568 +++ basename /dev/fd/62 00:04:56.568 ++ mktemp /tmp/62.XXX 00:04:56.568 + tmp_file_1=/tmp/62.idO 00:04:56.568 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:04:56.568 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:04:56.568 + tmp_file_2=/tmp/spdk_tgt_config.json.Dbx 00:04:56.568 + ret=0 00:04:56.568 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:04:56.568 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:04:56.568 + diff -u /tmp/62.idO /tmp/spdk_tgt_config.json.Dbx 00:04:56.568 + echo 'INFO: JSON config files are the same' 00:04:56.568 INFO: JSON config files are the same 00:04:56.568 + rm /tmp/62.idO /tmp/spdk_tgt_config.json.Dbx 00:04:56.569 + exit 0 00:04:56.569 04:05:44 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:04:56.569 04:05:44 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:04:56.569 INFO: changing configuration and checking if this can be detected... 00:04:56.569 04:05:44 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:04:56.569 04:05:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:04:56.828 04:05:44 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:04:56.828 04:05:44 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:04:56.828 04:05:44 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:04:56.828 + '[' 2 -ne 2 ']' 00:04:56.828 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:04:56.828 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:04:56.828 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:56.828 +++ basename /dev/fd/62 00:04:56.828 ++ mktemp /tmp/62.XXX 00:04:56.828 + tmp_file_1=/tmp/62.rVn 00:04:56.828 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:04:56.828 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:04:56.828 + tmp_file_2=/tmp/spdk_tgt_config.json.HA3 00:04:56.828 + ret=0 00:04:56.828 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:04:57.086 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:04:57.347 + diff -u /tmp/62.rVn /tmp/spdk_tgt_config.json.HA3 00:04:57.347 + ret=1 00:04:57.347 + echo '=== Start of file: /tmp/62.rVn ===' 00:04:57.347 + cat /tmp/62.rVn 00:04:57.347 + echo '=== End of file: /tmp/62.rVn ===' 00:04:57.347 + echo '' 00:04:57.347 + echo '=== Start of file: /tmp/spdk_tgt_config.json.HA3 ===' 00:04:57.347 + cat /tmp/spdk_tgt_config.json.HA3 00:04:57.347 + echo '=== End of file: /tmp/spdk_tgt_config.json.HA3 ===' 00:04:57.347 + echo '' 00:04:57.347 + rm /tmp/62.rVn /tmp/spdk_tgt_config.json.HA3 00:04:57.347 + exit 1 00:04:57.347 04:05:45 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:04:57.347 INFO: configuration change detected. 00:04:57.347 04:05:45 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:04:57.347 04:05:45 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:04:57.347 04:05:45 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:04:57.347 04:05:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:57.347 04:05:45 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:04:57.347 04:05:45 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:04:57.347 04:05:45 json_config -- json_config/json_config.sh@317 -- # [[ -n 3793574 ]] 00:04:57.347 04:05:45 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:04:57.347 04:05:45 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:04:57.347 04:05:45 json_config -- common/autotest_common.sh@720 -- # xtrace_disable 00:04:57.347 04:05:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:57.347 04:05:45 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:04:57.347 04:05:45 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:04:57.347 04:05:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:04:57.606 04:05:45 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:04:57.606 04:05:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:04:57.606 04:05:45 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:04:57.606 04:05:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:04:57.865 04:05:45 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:04:57.865 04:05:45 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:04:58.123 04:05:46 json_config -- json_config/json_config.sh@193 -- # uname -s 00:04:58.123 04:05:46 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:04:58.123 04:05:46 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:04:58.123 04:05:46 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:04:58.123 04:05:46 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:04:58.123 04:05:46 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:58.123 04:05:46 json_config -- common/autotest_common.sh@10 -- # set +x 00:04:58.123 04:05:46 json_config -- json_config/json_config.sh@323 -- # killprocess 3793574 00:04:58.123 04:05:46 json_config -- common/autotest_common.sh@946 -- # '[' -z 3793574 ']' 00:04:58.123 04:05:46 json_config -- common/autotest_common.sh@950 -- # kill -0 3793574 00:04:58.123 04:05:46 json_config -- common/autotest_common.sh@951 -- # uname 00:04:58.123 04:05:46 json_config -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:04:58.382 04:05:46 json_config -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3793574 00:04:58.382 04:05:46 json_config -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:04:58.382 04:05:46 json_config -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:04:58.382 04:05:46 json_config -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3793574' 00:04:58.382 killing process with pid 3793574 00:04:58.382 04:05:46 json_config -- common/autotest_common.sh@965 -- # kill 3793574 00:04:58.382 04:05:46 json_config -- common/autotest_common.sh@970 -- # wait 3793574 00:05:01.672 04:05:48 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:01.672 04:05:48 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:05:01.672 04:05:48 json_config -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:01.672 04:05:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:01.672 04:05:49 json_config -- json_config/json_config.sh@328 -- # return 0 00:05:01.672 04:05:49 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:05:01.672 INFO: Success 00:05:01.672 00:05:01.672 real 0m31.460s 00:05:01.672 user 0m36.772s 00:05:01.672 sys 0m3.368s 00:05:01.672 04:05:49 json_config -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:01.672 04:05:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:01.672 ************************************ 00:05:01.672 END TEST json_config 00:05:01.672 ************************************ 00:05:01.672 04:05:49 -- spdk/autotest.sh@169 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:01.672 04:05:49 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:01.672 04:05:49 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:01.672 04:05:49 -- common/autotest_common.sh@10 -- # set +x 00:05:01.672 ************************************ 00:05:01.672 START TEST json_config_extra_key 00:05:01.672 ************************************ 00:05:01.672 04:05:49 json_config_extra_key -- common/autotest_common.sh@1121 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:01.672 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8b464f06-2980-e311-ba20-001e67a94acd 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=8b464f06-2980-e311-ba20-001e67a94acd 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:01.672 04:05:49 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:01.672 04:05:49 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:01.672 04:05:49 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:01.672 04:05:49 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:01.672 04:05:49 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:01.673 04:05:49 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:01.673 04:05:49 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:01.673 04:05:49 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:05:01.673 04:05:49 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:01.673 04:05:49 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:05:01.673 04:05:49 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:01.673 04:05:49 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:01.673 04:05:49 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:01.673 04:05:49 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:01.673 04:05:49 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:01.673 04:05:49 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:05:01.673 04:05:49 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:01.673 04:05:49 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:05:01.673 INFO: launching applications... 
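Note: the declare -A traces above show how json_config/common.sh keeps per-application state (PID, RPC socket, extra parameters, config file) in bash associative arrays keyed by the app name. A minimal sketch of that bookkeeping pattern, assuming SPDK_DIR points at the spdk checkout used in this run; the launch line is an illustration of how the arrays get used, not the exact json_config_test_start_app helper:

    declare -A app_pid=([target]='')
    declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
    declare -A app_params=([target]='-m 0x1 -s 1024')
    declare -A configs_path=([target]="$SPDK_DIR/test/json_config/extra_key.json")

    app=target
    # app_params is deliberately left unquoted so '-m 0x1 -s 1024' splits into separate arguments
    "$SPDK_DIR"/build/bin/spdk_tgt ${app_params[$app]} \
        -r "${app_socket[$app]}" --json "${configs_path[$app]}" &
    app_pid[$app]=$!                           # remembered so the app can be signalled later

Keying everything by the app name is what lets the same helpers manage more than one application (a target config and an initiator config) within one test.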
00:05:01.673 04:05:49 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=3795026 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:01.673 Waiting for target to run... 00:05:01.673 04:05:49 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 3795026 /var/tmp/spdk_tgt.sock 00:05:01.673 04:05:49 json_config_extra_key -- common/autotest_common.sh@827 -- # '[' -z 3795026 ']' 00:05:01.673 04:05:49 json_config_extra_key -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:01.673 04:05:49 json_config_extra_key -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:01.673 04:05:49 json_config_extra_key -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:01.673 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:01.673 04:05:49 json_config_extra_key -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:01.673 04:05:49 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:05:01.673 [2024-05-15 04:05:49.179877] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:01.673 [2024-05-15 04:05:49.179956] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3795026 ] 00:05:01.931 [2024-05-15 04:05:49.718753] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:01.931 [2024-05-15 04:05:49.826534] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:02.191 04:05:50 json_config_extra_key -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:02.192 04:05:50 json_config_extra_key -- common/autotest_common.sh@860 -- # return 0 00:05:02.192 04:05:50 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:05:02.192 00:05:02.192 04:05:50 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:05:02.192 INFO: shutting down applications... 
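Note: waitforlisten, traced above with max_retries=100, blocks until the freshly launched spdk_tgt answers RPCs on /var/tmp/spdk_tgt.sock. One way to express that kind of readiness poll, assuming SPDK_DIR points at the spdk checkout and reusing the rpc.py -t timeout flag and the rpc_get_methods call seen elsewhere in this log; this is an illustration of the idea, not the exact waitforlisten implementation in autotest_common.sh:

    wait_for_rpc() {
        local sock=$1 retries=${2:-100}
        while (( retries-- > 0 )); do
            # a successful rpc_get_methods means the app is up and its RPC server is listening
            if "$SPDK_DIR"/scripts/rpc.py -t 1 -s "$sock" rpc_get_methods >/dev/null 2>&1; then
                return 0
            fi
            sleep 0.5
        done
        return 1                               # the app never started listening
    }

    wait_for_rpc /var/tmp/spdk_tgt.sock || exit 1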
00:05:02.192 04:05:50 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:05:02.192 04:05:50 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:05:02.192 04:05:50 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:02.192 04:05:50 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 3795026 ]] 00:05:02.192 04:05:50 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 3795026 00:05:02.192 04:05:50 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:02.192 04:05:50 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:02.192 04:05:50 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 3795026 00:05:02.192 04:05:50 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:05:02.761 04:05:50 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:05:02.761 04:05:50 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:02.761 04:05:50 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 3795026 00:05:02.761 04:05:50 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:02.761 04:05:50 json_config_extra_key -- json_config/common.sh@43 -- # break 00:05:02.761 04:05:50 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:02.761 04:05:50 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:02.761 SPDK target shutdown done 00:05:02.761 04:05:50 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:05:02.761 Success 00:05:02.761 00:05:02.761 real 0m1.562s 00:05:02.761 user 0m1.177s 00:05:02.761 sys 0m0.633s 00:05:02.761 04:05:50 json_config_extra_key -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:02.761 04:05:50 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:05:02.761 ************************************ 00:05:02.761 END TEST json_config_extra_key 00:05:02.761 ************************************ 00:05:02.761 04:05:50 -- spdk/autotest.sh@170 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:02.761 04:05:50 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:02.761 04:05:50 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:02.761 04:05:50 -- common/autotest_common.sh@10 -- # set +x 00:05:02.761 ************************************ 00:05:02.761 START TEST alias_rpc 00:05:02.761 ************************************ 00:05:02.761 04:05:50 alias_rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:02.761 * Looking for test storage... 
00:05:02.761 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:05:02.761 04:05:50 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:02.761 04:05:50 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=3795333 00:05:02.761 04:05:50 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:02.761 04:05:50 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 3795333 00:05:02.761 04:05:50 alias_rpc -- common/autotest_common.sh@827 -- # '[' -z 3795333 ']' 00:05:02.761 04:05:50 alias_rpc -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:02.761 04:05:50 alias_rpc -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:02.761 04:05:50 alias_rpc -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:02.761 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:02.761 04:05:50 alias_rpc -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:02.761 04:05:50 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:03.022 [2024-05-15 04:05:50.794354] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:03.022 [2024-05-15 04:05:50.794426] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3795333 ] 00:05:03.022 [2024-05-15 04:05:50.868863] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:03.022 [2024-05-15 04:05:50.980199] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:03.960 04:05:51 alias_rpc -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:03.960 04:05:51 alias_rpc -- common/autotest_common.sh@860 -- # return 0 00:05:03.960 04:05:51 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:04.220 04:05:51 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 3795333 00:05:04.220 04:05:51 alias_rpc -- common/autotest_common.sh@946 -- # '[' -z 3795333 ']' 00:05:04.220 04:05:51 alias_rpc -- common/autotest_common.sh@950 -- # kill -0 3795333 00:05:04.220 04:05:51 alias_rpc -- common/autotest_common.sh@951 -- # uname 00:05:04.220 04:05:51 alias_rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:04.220 04:05:52 alias_rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3795333 00:05:04.220 04:05:52 alias_rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:04.220 04:05:52 alias_rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:04.220 04:05:52 alias_rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3795333' 00:05:04.220 killing process with pid 3795333 00:05:04.220 04:05:52 alias_rpc -- common/autotest_common.sh@965 -- # kill 3795333 00:05:04.220 04:05:52 alias_rpc -- common/autotest_common.sh@970 -- # wait 3795333 00:05:04.787 00:05:04.787 real 0m1.820s 00:05:04.787 user 0m2.074s 00:05:04.787 sys 0m0.469s 00:05:04.788 04:05:52 alias_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:04.788 04:05:52 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:04.788 ************************************ 00:05:04.788 END TEST alias_rpc 00:05:04.788 
************************************ 00:05:04.788 04:05:52 -- spdk/autotest.sh@172 -- # [[ 0 -eq 0 ]] 00:05:04.788 04:05:52 -- spdk/autotest.sh@173 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:04.788 04:05:52 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:04.788 04:05:52 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:04.788 04:05:52 -- common/autotest_common.sh@10 -- # set +x 00:05:04.788 ************************************ 00:05:04.788 START TEST spdkcli_tcp 00:05:04.788 ************************************ 00:05:04.788 04:05:52 spdkcli_tcp -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:04.788 * Looking for test storage... 00:05:04.788 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:05:04.788 04:05:52 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:05:04.788 04:05:52 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:04.788 04:05:52 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:05:04.788 04:05:52 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:04.788 04:05:52 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:04.788 04:05:52 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:04.788 04:05:52 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:04.788 04:05:52 spdkcli_tcp -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:04.788 04:05:52 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:04.788 04:05:52 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=3795600 00:05:04.788 04:05:52 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:04.788 04:05:52 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 3795600 00:05:04.788 04:05:52 spdkcli_tcp -- common/autotest_common.sh@827 -- # '[' -z 3795600 ']' 00:05:04.788 04:05:52 spdkcli_tcp -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:04.788 04:05:52 spdkcli_tcp -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:04.788 04:05:52 spdkcli_tcp -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:04.788 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:04.788 04:05:52 spdkcli_tcp -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:04.788 04:05:52 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:04.788 [2024-05-15 04:05:52.667797] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
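Note: unlike the earlier tests, spdkcli_tcp talks to the target over TCP. The trace that follows bridges 127.0.0.1:9998 to the target's UNIX RPC socket with socat and then points rpc.py at the TCP address. A condensed version of that setup, assuming socat is installed and a target is already listening on /var/tmp/spdk.sock; the flags mirror the trace below:

    socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &   # forward one TCP connection to the RPC socket
    socat_pid=$!

    # same flags as in the trace: -s/-p give the TCP address, -r/-t the connection retries and timeout
    "$SPDK_DIR"/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods

    kill "$socat_pid" 2>/dev/null              # tear the bridge down afterwards

The long JSON array that follows in the log is simply rpc_get_methods listing every RPC the target currently exposes, which doubles as an end-to-end check that the TCP path works.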
00:05:04.788 [2024-05-15 04:05:52.667896] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3795600 ] 00:05:04.788 [2024-05-15 04:05:52.743301] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:05.048 [2024-05-15 04:05:52.853426] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:05.048 [2024-05-15 04:05:52.853431] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:05.617 04:05:53 spdkcli_tcp -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:05.617 04:05:53 spdkcli_tcp -- common/autotest_common.sh@860 -- # return 0 00:05:05.617 04:05:53 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=3795665 00:05:05.617 04:05:53 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:05.617 04:05:53 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:05.875 [ 00:05:05.875 "bdev_malloc_delete", 00:05:05.875 "bdev_malloc_create", 00:05:05.875 "bdev_null_resize", 00:05:05.875 "bdev_null_delete", 00:05:05.875 "bdev_null_create", 00:05:05.875 "bdev_nvme_cuse_unregister", 00:05:05.875 "bdev_nvme_cuse_register", 00:05:05.875 "bdev_opal_new_user", 00:05:05.875 "bdev_opal_set_lock_state", 00:05:05.875 "bdev_opal_delete", 00:05:05.875 "bdev_opal_get_info", 00:05:05.875 "bdev_opal_create", 00:05:05.875 "bdev_nvme_opal_revert", 00:05:05.875 "bdev_nvme_opal_init", 00:05:05.875 "bdev_nvme_send_cmd", 00:05:05.875 "bdev_nvme_get_path_iostat", 00:05:05.875 "bdev_nvme_get_mdns_discovery_info", 00:05:05.875 "bdev_nvme_stop_mdns_discovery", 00:05:05.875 "bdev_nvme_start_mdns_discovery", 00:05:05.875 "bdev_nvme_set_multipath_policy", 00:05:05.875 "bdev_nvme_set_preferred_path", 00:05:05.875 "bdev_nvme_get_io_paths", 00:05:05.875 "bdev_nvme_remove_error_injection", 00:05:05.875 "bdev_nvme_add_error_injection", 00:05:05.875 "bdev_nvme_get_discovery_info", 00:05:05.875 "bdev_nvme_stop_discovery", 00:05:05.875 "bdev_nvme_start_discovery", 00:05:05.876 "bdev_nvme_get_controller_health_info", 00:05:05.876 "bdev_nvme_disable_controller", 00:05:05.876 "bdev_nvme_enable_controller", 00:05:05.876 "bdev_nvme_reset_controller", 00:05:05.876 "bdev_nvme_get_transport_statistics", 00:05:05.876 "bdev_nvme_apply_firmware", 00:05:05.876 "bdev_nvme_detach_controller", 00:05:05.876 "bdev_nvme_get_controllers", 00:05:05.876 "bdev_nvme_attach_controller", 00:05:05.876 "bdev_nvme_set_hotplug", 00:05:05.876 "bdev_nvme_set_options", 00:05:05.876 "bdev_passthru_delete", 00:05:05.876 "bdev_passthru_create", 00:05:05.876 "bdev_lvol_check_shallow_copy", 00:05:05.876 "bdev_lvol_start_shallow_copy", 00:05:05.876 "bdev_lvol_grow_lvstore", 00:05:05.876 "bdev_lvol_get_lvols", 00:05:05.876 "bdev_lvol_get_lvstores", 00:05:05.876 "bdev_lvol_delete", 00:05:05.876 "bdev_lvol_set_read_only", 00:05:05.876 "bdev_lvol_resize", 00:05:05.876 "bdev_lvol_decouple_parent", 00:05:05.876 "bdev_lvol_inflate", 00:05:05.876 "bdev_lvol_rename", 00:05:05.876 "bdev_lvol_clone_bdev", 00:05:05.876 "bdev_lvol_clone", 00:05:05.876 "bdev_lvol_snapshot", 00:05:05.876 "bdev_lvol_create", 00:05:05.876 "bdev_lvol_delete_lvstore", 00:05:05.876 "bdev_lvol_rename_lvstore", 00:05:05.876 "bdev_lvol_create_lvstore", 00:05:05.876 "bdev_raid_set_options", 00:05:05.876 "bdev_raid_remove_base_bdev", 00:05:05.876 
"bdev_raid_add_base_bdev", 00:05:05.876 "bdev_raid_delete", 00:05:05.876 "bdev_raid_create", 00:05:05.876 "bdev_raid_get_bdevs", 00:05:05.876 "bdev_error_inject_error", 00:05:05.876 "bdev_error_delete", 00:05:05.876 "bdev_error_create", 00:05:05.876 "bdev_split_delete", 00:05:05.876 "bdev_split_create", 00:05:05.876 "bdev_delay_delete", 00:05:05.876 "bdev_delay_create", 00:05:05.876 "bdev_delay_update_latency", 00:05:05.876 "bdev_zone_block_delete", 00:05:05.876 "bdev_zone_block_create", 00:05:05.876 "blobfs_create", 00:05:05.876 "blobfs_detect", 00:05:05.876 "blobfs_set_cache_size", 00:05:05.876 "bdev_crypto_delete", 00:05:05.876 "bdev_crypto_create", 00:05:05.876 "bdev_compress_delete", 00:05:05.876 "bdev_compress_create", 00:05:05.876 "bdev_compress_get_orphans", 00:05:05.876 "bdev_aio_delete", 00:05:05.876 "bdev_aio_rescan", 00:05:05.876 "bdev_aio_create", 00:05:05.876 "bdev_ftl_set_property", 00:05:05.876 "bdev_ftl_get_properties", 00:05:05.876 "bdev_ftl_get_stats", 00:05:05.876 "bdev_ftl_unmap", 00:05:05.876 "bdev_ftl_unload", 00:05:05.876 "bdev_ftl_delete", 00:05:05.876 "bdev_ftl_load", 00:05:05.876 "bdev_ftl_create", 00:05:05.876 "bdev_virtio_attach_controller", 00:05:05.876 "bdev_virtio_scsi_get_devices", 00:05:05.876 "bdev_virtio_detach_controller", 00:05:05.876 "bdev_virtio_blk_set_hotplug", 00:05:05.876 "bdev_iscsi_delete", 00:05:05.876 "bdev_iscsi_create", 00:05:05.876 "bdev_iscsi_set_options", 00:05:05.876 "accel_error_inject_error", 00:05:05.876 "ioat_scan_accel_module", 00:05:05.876 "dsa_scan_accel_module", 00:05:05.876 "iaa_scan_accel_module", 00:05:05.876 "dpdk_cryptodev_get_driver", 00:05:05.876 "dpdk_cryptodev_set_driver", 00:05:05.876 "dpdk_cryptodev_scan_accel_module", 00:05:05.876 "compressdev_scan_accel_module", 00:05:05.876 "keyring_file_remove_key", 00:05:05.876 "keyring_file_add_key", 00:05:05.876 "iscsi_get_histogram", 00:05:05.876 "iscsi_enable_histogram", 00:05:05.876 "iscsi_set_options", 00:05:05.876 "iscsi_get_auth_groups", 00:05:05.876 "iscsi_auth_group_remove_secret", 00:05:05.876 "iscsi_auth_group_add_secret", 00:05:05.876 "iscsi_delete_auth_group", 00:05:05.876 "iscsi_create_auth_group", 00:05:05.876 "iscsi_set_discovery_auth", 00:05:05.876 "iscsi_get_options", 00:05:05.876 "iscsi_target_node_request_logout", 00:05:05.876 "iscsi_target_node_set_redirect", 00:05:05.876 "iscsi_target_node_set_auth", 00:05:05.876 "iscsi_target_node_add_lun", 00:05:05.876 "iscsi_get_stats", 00:05:05.876 "iscsi_get_connections", 00:05:05.876 "iscsi_portal_group_set_auth", 00:05:05.876 "iscsi_start_portal_group", 00:05:05.876 "iscsi_delete_portal_group", 00:05:05.876 "iscsi_create_portal_group", 00:05:05.876 "iscsi_get_portal_groups", 00:05:05.876 "iscsi_delete_target_node", 00:05:05.876 "iscsi_target_node_remove_pg_ig_maps", 00:05:05.876 "iscsi_target_node_add_pg_ig_maps", 00:05:05.876 "iscsi_create_target_node", 00:05:05.876 "iscsi_get_target_nodes", 00:05:05.876 "iscsi_delete_initiator_group", 00:05:05.876 "iscsi_initiator_group_remove_initiators", 00:05:05.876 "iscsi_initiator_group_add_initiators", 00:05:05.876 "iscsi_create_initiator_group", 00:05:05.876 "iscsi_get_initiator_groups", 00:05:05.876 "nvmf_set_crdt", 00:05:05.876 "nvmf_set_config", 00:05:05.876 "nvmf_set_max_subsystems", 00:05:05.876 "nvmf_stop_mdns_prr", 00:05:05.876 "nvmf_publish_mdns_prr", 00:05:05.876 "nvmf_subsystem_get_listeners", 00:05:05.876 "nvmf_subsystem_get_qpairs", 00:05:05.876 "nvmf_subsystem_get_controllers", 00:05:05.876 "nvmf_get_stats", 00:05:05.876 "nvmf_get_transports", 00:05:05.876 
"nvmf_create_transport", 00:05:05.876 "nvmf_get_targets", 00:05:05.876 "nvmf_delete_target", 00:05:05.876 "nvmf_create_target", 00:05:05.876 "nvmf_subsystem_allow_any_host", 00:05:05.876 "nvmf_subsystem_remove_host", 00:05:05.876 "nvmf_subsystem_add_host", 00:05:05.876 "nvmf_ns_remove_host", 00:05:05.876 "nvmf_ns_add_host", 00:05:05.876 "nvmf_subsystem_remove_ns", 00:05:05.876 "nvmf_subsystem_add_ns", 00:05:05.876 "nvmf_subsystem_listener_set_ana_state", 00:05:05.876 "nvmf_discovery_get_referrals", 00:05:05.876 "nvmf_discovery_remove_referral", 00:05:05.876 "nvmf_discovery_add_referral", 00:05:05.876 "nvmf_subsystem_remove_listener", 00:05:05.876 "nvmf_subsystem_add_listener", 00:05:05.876 "nvmf_delete_subsystem", 00:05:05.876 "nvmf_create_subsystem", 00:05:05.876 "nvmf_get_subsystems", 00:05:05.876 "env_dpdk_get_mem_stats", 00:05:05.876 "nbd_get_disks", 00:05:05.876 "nbd_stop_disk", 00:05:05.876 "nbd_start_disk", 00:05:05.876 "ublk_recover_disk", 00:05:05.876 "ublk_get_disks", 00:05:05.876 "ublk_stop_disk", 00:05:05.876 "ublk_start_disk", 00:05:05.876 "ublk_destroy_target", 00:05:05.876 "ublk_create_target", 00:05:05.876 "virtio_blk_create_transport", 00:05:05.876 "virtio_blk_get_transports", 00:05:05.876 "vhost_controller_set_coalescing", 00:05:05.876 "vhost_get_controllers", 00:05:05.876 "vhost_delete_controller", 00:05:05.876 "vhost_create_blk_controller", 00:05:05.876 "vhost_scsi_controller_remove_target", 00:05:05.876 "vhost_scsi_controller_add_target", 00:05:05.876 "vhost_start_scsi_controller", 00:05:05.876 "vhost_create_scsi_controller", 00:05:05.876 "thread_set_cpumask", 00:05:05.876 "framework_get_scheduler", 00:05:05.876 "framework_set_scheduler", 00:05:05.876 "framework_get_reactors", 00:05:05.876 "thread_get_io_channels", 00:05:05.876 "thread_get_pollers", 00:05:05.876 "thread_get_stats", 00:05:05.876 "framework_monitor_context_switch", 00:05:05.876 "spdk_kill_instance", 00:05:05.876 "log_enable_timestamps", 00:05:05.876 "log_get_flags", 00:05:05.876 "log_clear_flag", 00:05:05.876 "log_set_flag", 00:05:05.876 "log_get_level", 00:05:05.876 "log_set_level", 00:05:05.876 "log_get_print_level", 00:05:05.876 "log_set_print_level", 00:05:05.876 "framework_enable_cpumask_locks", 00:05:05.876 "framework_disable_cpumask_locks", 00:05:05.876 "framework_wait_init", 00:05:05.876 "framework_start_init", 00:05:05.876 "scsi_get_devices", 00:05:05.876 "bdev_get_histogram", 00:05:05.876 "bdev_enable_histogram", 00:05:05.876 "bdev_set_qos_limit", 00:05:05.876 "bdev_set_qd_sampling_period", 00:05:05.876 "bdev_get_bdevs", 00:05:05.876 "bdev_reset_iostat", 00:05:05.876 "bdev_get_iostat", 00:05:05.876 "bdev_examine", 00:05:05.876 "bdev_wait_for_examine", 00:05:05.876 "bdev_set_options", 00:05:05.876 "notify_get_notifications", 00:05:05.876 "notify_get_types", 00:05:05.876 "accel_get_stats", 00:05:05.876 "accel_set_options", 00:05:05.876 "accel_set_driver", 00:05:05.876 "accel_crypto_key_destroy", 00:05:05.876 "accel_crypto_keys_get", 00:05:05.876 "accel_crypto_key_create", 00:05:05.876 "accel_assign_opc", 00:05:05.876 "accel_get_module_info", 00:05:05.876 "accel_get_opc_assignments", 00:05:05.876 "vmd_rescan", 00:05:05.876 "vmd_remove_device", 00:05:05.876 "vmd_enable", 00:05:05.876 "sock_get_default_impl", 00:05:05.876 "sock_set_default_impl", 00:05:05.876 "sock_impl_set_options", 00:05:05.876 "sock_impl_get_options", 00:05:05.876 "iobuf_get_stats", 00:05:05.876 "iobuf_set_options", 00:05:05.876 "framework_get_pci_devices", 00:05:05.876 "framework_get_config", 00:05:05.876 
"framework_get_subsystems", 00:05:05.876 "trace_get_info", 00:05:05.876 "trace_get_tpoint_group_mask", 00:05:05.876 "trace_disable_tpoint_group", 00:05:05.876 "trace_enable_tpoint_group", 00:05:05.876 "trace_clear_tpoint_mask", 00:05:05.876 "trace_set_tpoint_mask", 00:05:05.876 "keyring_get_keys", 00:05:05.876 "spdk_get_version", 00:05:05.876 "rpc_get_methods" 00:05:05.876 ] 00:05:05.876 04:05:53 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:05:05.876 04:05:53 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:05.876 04:05:53 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:05.876 04:05:53 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:05.876 04:05:53 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 3795600 00:05:05.876 04:05:53 spdkcli_tcp -- common/autotest_common.sh@946 -- # '[' -z 3795600 ']' 00:05:05.876 04:05:53 spdkcli_tcp -- common/autotest_common.sh@950 -- # kill -0 3795600 00:05:05.876 04:05:53 spdkcli_tcp -- common/autotest_common.sh@951 -- # uname 00:05:05.876 04:05:53 spdkcli_tcp -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:05.876 04:05:53 spdkcli_tcp -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3795600 00:05:06.136 04:05:53 spdkcli_tcp -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:06.136 04:05:53 spdkcli_tcp -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:06.136 04:05:53 spdkcli_tcp -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3795600' 00:05:06.136 killing process with pid 3795600 00:05:06.136 04:05:53 spdkcli_tcp -- common/autotest_common.sh@965 -- # kill 3795600 00:05:06.136 04:05:53 spdkcli_tcp -- common/autotest_common.sh@970 -- # wait 3795600 00:05:06.396 00:05:06.396 real 0m1.818s 00:05:06.396 user 0m3.439s 00:05:06.396 sys 0m0.501s 00:05:06.396 04:05:54 spdkcli_tcp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:06.396 04:05:54 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:06.396 ************************************ 00:05:06.396 END TEST spdkcli_tcp 00:05:06.396 ************************************ 00:05:06.396 04:05:54 -- spdk/autotest.sh@176 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:06.396 04:05:54 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:06.397 04:05:54 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:06.397 04:05:54 -- common/autotest_common.sh@10 -- # set +x 00:05:06.655 ************************************ 00:05:06.655 START TEST dpdk_mem_utility 00:05:06.655 ************************************ 00:05:06.655 04:05:54 dpdk_mem_utility -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:06.656 * Looking for test storage... 
00:05:06.656 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:05:06.656 04:05:54 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:06.656 04:05:54 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=3795858 00:05:06.656 04:05:54 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:06.656 04:05:54 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 3795858 00:05:06.656 04:05:54 dpdk_mem_utility -- common/autotest_common.sh@827 -- # '[' -z 3795858 ']' 00:05:06.656 04:05:54 dpdk_mem_utility -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:06.656 04:05:54 dpdk_mem_utility -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:06.656 04:05:54 dpdk_mem_utility -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:06.656 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:06.656 04:05:54 dpdk_mem_utility -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:06.656 04:05:54 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:05:06.656 [2024-05-15 04:05:54.533779] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:06.656 [2024-05-15 04:05:54.533870] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3795858 ] 00:05:06.656 [2024-05-15 04:05:54.609514] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:06.915 [2024-05-15 04:05:54.722006] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:07.483 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:07.483 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@860 -- # return 0 00:05:07.483 04:05:55 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:07.483 04:05:55 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:07.483 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:07.483 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:05:07.483 { 00:05:07.483 "filename": "/tmp/spdk_mem_dump.txt" 00:05:07.483 } 00:05:07.483 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:07.483 04:05:55 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:07.748 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:07.748 1 heaps totaling size 814.000000 MiB 00:05:07.748 size: 814.000000 MiB heap id: 0 00:05:07.748 end heaps---------- 00:05:07.748 8 mempools totaling size 598.116089 MiB 00:05:07.748 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:07.748 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:07.748 size: 84.521057 MiB name: bdev_io_3795858 00:05:07.748 size: 51.011292 MiB name: evtpool_3795858 00:05:07.748 size: 50.003479 MiB name: msgpool_3795858 00:05:07.748 size: 21.763794 MiB name: PDU_Pool 
00:05:07.748 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:07.748 size: 0.026123 MiB name: Session_Pool 00:05:07.748 end mempools------- 00:05:07.748 201 memzones totaling size 4.173523 MiB 00:05:07.748 size: 1.000366 MiB name: RG_ring_0_3795858 00:05:07.748 size: 1.000366 MiB name: RG_ring_1_3795858 00:05:07.748 size: 1.000366 MiB name: RG_ring_4_3795858 00:05:07.748 size: 1.000366 MiB name: RG_ring_5_3795858 00:05:07.748 size: 0.125366 MiB name: RG_ring_2_3795858 00:05:07.748 size: 0.015991 MiB name: RG_ring_3_3795858 00:05:07.748 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:05:07.748 size: 0.000244 MiB name: 0000:0c:01.0_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:01.1_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:01.2_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:01.3_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:01.4_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:01.5_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:01.6_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:01.7_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:02.0_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:02.1_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:02.2_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:02.3_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:02.4_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:02.5_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:02.6_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0c:02.7_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:01.0_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:01.1_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:01.2_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:01.3_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:01.4_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:01.5_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:01.6_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:01.7_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:02.0_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:02.1_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:02.2_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:02.3_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:02.4_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:02.5_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:02.6_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0d:02.7_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:01.0_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:01.1_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:01.2_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:01.3_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:01.4_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:01.5_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:01.6_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:01.7_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:02.0_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:02.1_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:02.2_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:02.3_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:02.4_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:02.5_qat 00:05:07.748 size: 0.000244 MiB name: 0000:0e:02.6_qat 00:05:07.749 size: 0.000244 MiB name: 0000:0e:02.7_qat 00:05:07.749 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_0 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_0 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_1 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_2 00:05:07.749 size: 0.000122 MiB 
name: rte_compressdev_data_1 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_3 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_4 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_2 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_5 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_6 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_3 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_7 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_8 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_4 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_9 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_10 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_5 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_11 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_12 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_6 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_13 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_14 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_7 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_15 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_16 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_8 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_17 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_18 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_9 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_19 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_20 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_10 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_21 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_22 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_11 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_23 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_24 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_12 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_25 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_26 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_13 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_27 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_28 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_14 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_29 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_30 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_15 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_31 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_32 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_16 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_33 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_34 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_17 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_35 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_36 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_18 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_37 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_38 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_19 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_39 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_40 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_20 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_41 00:05:07.749 size: 
0.000122 MiB name: rte_cryptodev_data_42 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_21 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_43 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_44 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_22 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_45 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_46 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_23 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_47 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_48 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_24 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_49 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_50 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_25 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_51 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_52 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_26 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_53 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_54 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_27 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_55 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_56 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_28 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_57 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_58 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_29 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_59 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_60 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_30 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_61 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_62 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_31 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_63 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_64 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_32 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_65 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_66 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_33 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_67 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_68 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_34 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_69 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_70 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_35 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_71 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_72 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_36 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_73 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_74 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_37 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_75 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_76 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_38 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_77 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_78 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_39 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_79 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_80 00:05:07.749 size: 0.000122 MiB name: 
rte_compressdev_data_40 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_81 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_82 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_41 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_83 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_84 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_42 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_85 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_86 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_43 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_87 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_88 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_44 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_89 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_90 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_45 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_91 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_92 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_46 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_93 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_94 00:05:07.749 size: 0.000122 MiB name: rte_compressdev_data_47 00:05:07.749 size: 0.000122 MiB name: rte_cryptodev_data_95 00:05:07.749 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:05:07.749 end memzones------- 00:05:07.749 04:05:55 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:07.749 heap id: 0 total size: 814.000000 MiB number of busy elements: 668 number of free elements: 14 00:05:07.749 list of free elements. size: 11.778625 MiB 00:05:07.749 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:07.749 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:07.749 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:07.749 element at address: 0x200003e00000 with size: 0.996460 MiB 00:05:07.749 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:07.749 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:07.749 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:07.749 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:07.749 element at address: 0x20001aa00000 with size: 0.563843 MiB 00:05:07.749 element at address: 0x200003a00000 with size: 0.493042 MiB 00:05:07.749 element at address: 0x20000b200000 with size: 0.488892 MiB 00:05:07.749 element at address: 0x200000800000 with size: 0.486145 MiB 00:05:07.749 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:07.749 element at address: 0x200027e00000 with size: 0.395752 MiB 00:05:07.749 list of standard malloc elements. 
size: 199.904297 MiB 00:05:07.749 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:07.749 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:07.749 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:07.749 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:07.749 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:07.749 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:07.749 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:07.749 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:07.749 element at address: 0x20000032c840 with size: 0.004395 MiB 00:05:07.749 element at address: 0x2000003302c0 with size: 0.004395 MiB 00:05:07.749 element at address: 0x200000333d40 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003377c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000033b240 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000033ecc0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000342740 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003461c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000349c40 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000034d6c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000351140 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000354bc0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000358640 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000035c0c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000035fb40 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003635c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000367040 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000036aac0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000036e540 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000371fc0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000375a40 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003794c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000037cf40 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003809c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000384440 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000387ec0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000038b940 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000038f3c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x200000392e40 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003968c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000039a340 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000039ddc0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003a1840 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003a52c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003a8d40 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003ac7c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003b0240 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003b3cc0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003b7740 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003bb1c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003bec40 with size: 0.004395 MiB 
00:05:07.750 element at address: 0x2000003c26c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003c6140 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003c9bc0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003cd640 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003d10c0 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003d4b40 with size: 0.004395 MiB 00:05:07.750 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:05:07.750 element at address: 0x20000032a740 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000032b7c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000032e1c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000032f240 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000331c40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000332cc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003356c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000336740 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000339140 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000033a1c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000033cbc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000033dc40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000340640 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003416c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003440c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000345140 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000347b40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000348bc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000034b5c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000034c640 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000034f040 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003500c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000352ac0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000353b40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000356540 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003575c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000359fc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000035b040 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000035da40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000035eac0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003614c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000362540 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000364f40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000365fc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003689c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000369a40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000036c440 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000036d4c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000036fec0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000370f40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000373940 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003749c0 with size: 0.004028 MiB 00:05:07.750 element at 
address: 0x2000003773c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000378440 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000037ae40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000037bec0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000037e8c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000037f940 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000382340 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003833c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000385dc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000386e40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000389840 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000038a8c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000038d2c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000038e340 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000390d40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000391dc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003947c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000395840 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000398240 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003992c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000039bcc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000039cd40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x20000039f740 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003a07c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003a31c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003a4240 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003a6c40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003a7cc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003aa6c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003ab740 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003ae140 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003af1c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003b1bc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003b2c40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003b5640 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003b66c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003b90c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003ba140 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003bcb40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003bdbc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003c05c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003c1640 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003c4040 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003c50c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003c7ac0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003c8b40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003cb540 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003cc5c0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003cefc0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003d0040 
with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003d2a40 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003d3ac0 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:05:07.750 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:05:07.750 element at address: 0x200000200000 with size: 0.000305 MiB 00:05:07.750 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:07.750 element at address: 0x200000200140 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000200200 with size: 0.000183 MiB 00:05:07.750 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000200380 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000200440 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000200500 with size: 0.000183 MiB 00:05:07.750 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000200680 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000200740 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000200800 with size: 0.000183 MiB 00:05:07.750 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000200980 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000200a40 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000200c40 with size: 0.000183 MiB 00:05:07.750 element at address: 0x200000204f00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000002251c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225280 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225340 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225400 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000002254c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225580 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225640 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225700 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000002257c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225880 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225940 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225a00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225ac0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225b80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225c40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225d00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225dc0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225e80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000225f40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226000 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000002260c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226180 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226240 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226300 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000002263c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226480 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226540 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226600 with size: 0.000183 MiB 
00:05:07.751 element at address: 0x2000002266c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226780 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226980 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226a40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226b00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226bc0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226c80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226d40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226e00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226ec0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000226f80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227040 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227100 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000002271c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227280 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227340 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227400 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000002274c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227580 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227640 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227700 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000002277c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227880 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227940 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227a00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227ac0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227b80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227c40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000227d00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000329f00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000329fc0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000032a180 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000032a340 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000032a400 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000032da40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000032dc00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000032ddc0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000032de80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000003314c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000331680 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000331840 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000331900 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000334f40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000335100 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000335380 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000003389c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000338b80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000338d40 with size: 0.000183 MiB 00:05:07.751 element at 
address: 0x200000338e00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000033c440 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000033c600 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000033c7c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000033c880 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000033fec0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000340080 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000340240 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000340300 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000343940 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000343b00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000343cc0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000343d80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000003473c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000347580 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000347740 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000347800 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000034ae40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000034b000 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000034b1c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000034b280 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000034e8c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000034ea80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000034ec40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000034ed00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000352340 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000352500 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000003526c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000352780 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000355dc0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000355f80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000356140 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000356200 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000359840 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000359a00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000359bc0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000359c80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000035d2c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000035d480 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000035d640 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000035d700 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000360d40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000360f00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000003610c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000361180 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000003647c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000364980 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000364b40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000364c00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000368240 
with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000368400 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000003685c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000368680 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000036bcc0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000036be80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000036c040 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000036c100 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000036f740 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000036f900 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000036fac0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000036fb80 with size: 0.000183 MiB 00:05:07.751 element at address: 0x2000003731c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000373380 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000373540 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000373600 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000376c40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000376e00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000376fc0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000377080 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000037a6c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000037a880 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000037aa40 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000037ab00 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000037e140 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000037e300 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000037e4c0 with size: 0.000183 MiB 00:05:07.751 element at address: 0x20000037e580 with size: 0.000183 MiB 00:05:07.751 element at address: 0x200000381bc0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000381d80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000381f40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000382000 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000385640 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000385800 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003859c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000385a80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003890c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000389280 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000389440 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000389500 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000038cb40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000038cd00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000038cec0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000038cf80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003905c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000390780 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000390940 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000390a00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000394040 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000394200 with size: 0.000183 MiB 
00:05:07.752 element at address: 0x2000003943c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000394480 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000397ac0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000397c80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000397e40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200000397f00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000039b540 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000039b700 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000039b8c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000039b980 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000039efc0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000039f180 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000039f340 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000039f400 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003a2a40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003a2c00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003a2dc0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003a2e80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003a64c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003a6680 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003a6840 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003a6900 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003a9f40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003aa100 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003aa2c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003aa380 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003ad9c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003adb80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003add40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003ade00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b1440 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b1600 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b17c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b1880 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b5240 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b5300 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b8940 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b8b00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b8cc0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003b8d80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003bc3c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003bc580 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003bc740 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003bc800 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003c0000 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003c01c0 with size: 0.000183 MiB 00:05:07.752 element at 
address: 0x2000003c0280 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003c38c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003c3a80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003c3c40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003c3d00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003c7340 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003c7500 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003c76c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003c7780 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003cadc0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003caf80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003cb140 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003cb200 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003ce840 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003cec80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003d22c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003d2480 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003d2640 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003d2700 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003d5e80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003d6100 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003d6800 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000003d68c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087c740 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087c800 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087c980 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7e380 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7e440 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7e500 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7e5c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7e680 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7e740 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7e800 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7e8c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7e980 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7ea40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7eb00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7ebc0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7ec80 
with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7ed40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7ee00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7eec0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7ef80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f040 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f100 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f1c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f280 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f340 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f400 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f4c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f580 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f640 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f700 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f7c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f880 with size: 0.000183 MiB 00:05:07.752 element at address: 0x200003a7f940 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27d280 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27d340 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27d400 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27d4c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27d580 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27d640 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27d700 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27d7c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27d880 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27d940 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000192efd00 with size: 0.000183 MiB 00:05:07.752 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:07.752 element at address: 0x20001aa90580 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90640 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90700 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa907c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90880 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90940 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90a00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90ac0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90b80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90c40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90d00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90dc0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90e80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa90f40 with size: 0.000183 MiB 
00:05:07.753 element at address: 0x20001aa91000 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa910c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91180 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91240 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91300 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa913c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91480 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91540 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91600 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa916c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91780 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91840 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91900 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa919c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91a80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91b40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91c00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91cc0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91d80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91e40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91f00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa91fc0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92080 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92140 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92200 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa922c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92380 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92440 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92500 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa925c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92680 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92740 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92800 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa928c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92980 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92a40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92b00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92bc0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92c80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92d40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92e00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92ec0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa92f80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93040 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93100 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa931c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93280 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93340 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93400 with size: 0.000183 MiB 00:05:07.753 element at 
address: 0x20001aa934c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93580 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93640 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93700 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa937c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93880 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93940 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93a00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93ac0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93b80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93c40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93d00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93dc0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93e80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa93f40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94000 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa940c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94180 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94240 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94300 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa943c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94480 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94540 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94600 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa946c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94780 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94840 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94900 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94b40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:07.753 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e65500 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e655c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6c1c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6c3c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6c480 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6c540 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6c600 
with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6c6c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6c780 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6c840 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6c900 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6c9c0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6ca80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6cb40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6cc00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6ccc0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6cd80 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6ce40 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6cf00 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6cfc0 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6d080 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6d140 with size: 0.000183 MiB 00:05:07.753 element at address: 0x200027e6d200 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6d2c0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6eac0 with size: 0.000183 MiB 
00:05:07.754 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:07.754 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:07.754 list of memzone associated elements. 
size: 602.317078 MiB 00:05:07.754 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:07.754 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:07.754 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:07.754 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:07.754 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:07.754 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_3795858_0 00:05:07.754 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:07.754 associated memzone info: size: 48.002930 MiB name: MP_evtpool_3795858_0 00:05:07.754 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:07.754 associated memzone info: size: 48.002930 MiB name: MP_msgpool_3795858_0 00:05:07.754 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:07.754 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:07.754 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:07.754 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:07.754 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:07.754 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_3795858 00:05:07.754 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:07.754 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_3795858 00:05:07.754 element at address: 0x200000227dc0 with size: 1.008118 MiB 00:05:07.754 associated memzone info: size: 1.007996 MiB name: MP_evtpool_3795858 00:05:07.754 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:07.754 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:07.754 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:07.754 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:07.754 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:07.754 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:07.754 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:07.754 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:07.754 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:07.754 associated memzone info: size: 1.000366 MiB name: RG_ring_0_3795858 00:05:07.754 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:07.754 associated memzone info: size: 1.000366 MiB name: RG_ring_1_3795858 00:05:07.754 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:07.754 associated memzone info: size: 1.000366 MiB name: RG_ring_4_3795858 00:05:07.754 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:07.754 associated memzone info: size: 1.000366 MiB name: RG_ring_5_3795858 00:05:07.754 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:05:07.754 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_3795858 00:05:07.754 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:07.754 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:07.754 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:07.754 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:07.754 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:07.754 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:07.754 element at address: 0x200000204fc0 with size: 0.125488 MiB 00:05:07.754 associated 
memzone info: size: 0.125366 MiB name: RG_ring_2_3795858 00:05:07.754 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:07.754 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:07.754 element at address: 0x200027e65680 with size: 0.023743 MiB 00:05:07.754 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:07.754 element at address: 0x200000200d00 with size: 0.016113 MiB 00:05:07.754 associated memzone info: size: 0.015991 MiB name: RG_ring_3_3795858 00:05:07.754 element at address: 0x200027e6b7c0 with size: 0.002441 MiB 00:05:07.754 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:07.754 element at address: 0x2000003d62c0 with size: 0.001282 MiB 00:05:07.754 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:05:07.754 element at address: 0x2000003d6a80 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:01.0_qat 00:05:07.754 element at address: 0x2000003d28c0 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:01.1_qat 00:05:07.754 element at address: 0x2000003cee40 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:01.2_qat 00:05:07.754 element at address: 0x2000003cb3c0 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:01.3_qat 00:05:07.754 element at address: 0x2000003c7940 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:01.4_qat 00:05:07.754 element at address: 0x2000003c3ec0 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:01.5_qat 00:05:07.754 element at address: 0x2000003c0440 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:01.6_qat 00:05:07.754 element at address: 0x2000003bc9c0 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:01.7_qat 00:05:07.754 element at address: 0x2000003b8f40 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:02.0_qat 00:05:07.754 element at address: 0x2000003b54c0 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:02.1_qat 00:05:07.754 element at address: 0x2000003b1a40 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:02.2_qat 00:05:07.754 element at address: 0x2000003adfc0 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:02.3_qat 00:05:07.754 element at address: 0x2000003aa540 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:02.4_qat 00:05:07.754 element at address: 0x2000003a6ac0 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:02.5_qat 00:05:07.754 element at address: 0x2000003a3040 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:02.6_qat 00:05:07.754 element at address: 0x20000039f5c0 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0c:02.7_qat 00:05:07.754 element at address: 0x20000039bb40 with size: 0.000366 MiB 00:05:07.754 associated memzone info: size: 0.000244 MiB name: 0000:0d:01.0_qat 00:05:07.754 element at address: 0x2000003980c0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 
0000:0d:01.1_qat 00:05:07.755 element at address: 0x200000394640 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:01.2_qat 00:05:07.755 element at address: 0x200000390bc0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:01.3_qat 00:05:07.755 element at address: 0x20000038d140 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:01.4_qat 00:05:07.755 element at address: 0x2000003896c0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:01.5_qat 00:05:07.755 element at address: 0x200000385c40 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:01.6_qat 00:05:07.755 element at address: 0x2000003821c0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:01.7_qat 00:05:07.755 element at address: 0x20000037e740 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:02.0_qat 00:05:07.755 element at address: 0x20000037acc0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:02.1_qat 00:05:07.755 element at address: 0x200000377240 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:02.2_qat 00:05:07.755 element at address: 0x2000003737c0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:02.3_qat 00:05:07.755 element at address: 0x20000036fd40 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:02.4_qat 00:05:07.755 element at address: 0x20000036c2c0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:02.5_qat 00:05:07.755 element at address: 0x200000368840 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:02.6_qat 00:05:07.755 element at address: 0x200000364dc0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0d:02.7_qat 00:05:07.755 element at address: 0x200000361340 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:01.0_qat 00:05:07.755 element at address: 0x20000035d8c0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:01.1_qat 00:05:07.755 element at address: 0x200000359e40 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:01.2_qat 00:05:07.755 element at address: 0x2000003563c0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:01.3_qat 00:05:07.755 element at address: 0x200000352940 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:01.4_qat 00:05:07.755 element at address: 0x20000034eec0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:01.5_qat 00:05:07.755 element at address: 0x20000034b440 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:01.6_qat 00:05:07.755 element at address: 0x2000003479c0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:01.7_qat 00:05:07.755 element at address: 0x200000343f40 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:02.0_qat 00:05:07.755 element at address: 
0x2000003404c0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:02.1_qat 00:05:07.755 element at address: 0x20000033ca40 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:02.2_qat 00:05:07.755 element at address: 0x200000338fc0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:02.3_qat 00:05:07.755 element at address: 0x200000335540 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:02.4_qat 00:05:07.755 element at address: 0x200000331ac0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:02.5_qat 00:05:07.755 element at address: 0x20000032e040 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:02.6_qat 00:05:07.755 element at address: 0x20000032a5c0 with size: 0.000366 MiB 00:05:07.755 associated memzone info: size: 0.000244 MiB name: 0000:0e:02.7_qat 00:05:07.755 element at address: 0x2000003d5d40 with size: 0.000305 MiB 00:05:07.755 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:05:07.755 element at address: 0x200000226840 with size: 0.000305 MiB 00:05:07.755 associated memzone info: size: 0.000183 MiB name: MP_msgpool_3795858 00:05:07.755 element at address: 0x200000200b00 with size: 0.000305 MiB 00:05:07.755 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_3795858 00:05:07.755 element at address: 0x200027e6c280 with size: 0.000305 MiB 00:05:07.755 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:07.755 element at address: 0x2000003d6980 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:05:07.755 element at address: 0x2000003d61c0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:05:07.755 element at address: 0x2000003d5f40 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:05:07.755 element at address: 0x2000003d27c0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:05:07.755 element at address: 0x2000003d2540 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:05:07.755 element at address: 0x2000003d2380 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:05:07.755 element at address: 0x2000003ced40 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 00:05:07.755 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:05:07.755 element at address: 0x2000003ce900 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:05:07.755 element at address: 0x2000003cb2c0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:05:07.755 element at address: 0x2000003cb040 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:05:07.755 element at address: 0x2000003cae80 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:05:07.755 element at 
address: 0x2000003c7840 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:05:07.755 element at address: 0x2000003c75c0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:05:07.755 element at address: 0x2000003c7400 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:05:07.755 element at address: 0x2000003c3dc0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:05:07.755 element at address: 0x2000003c3b40 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:05:07.755 element at address: 0x2000003c3980 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:05:07.755 element at address: 0x2000003c0340 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:05:07.755 element at address: 0x2000003c00c0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:05:07.755 element at address: 0x2000003bff00 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13 00:05:07.755 element at address: 0x2000003bc8c0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:05:07.755 element at address: 0x2000003bc640 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:05:07.755 element at address: 0x2000003bc480 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:05:07.755 element at address: 0x2000003b8e40 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:05:07.755 element at address: 0x2000003b8bc0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:05:07.755 element at address: 0x2000003b8a00 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:05:07.755 element at address: 0x2000003b53c0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:05:07.755 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:05:07.755 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:05:07.755 element at address: 0x2000003b1940 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:05:07.755 element at address: 0x2000003b16c0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:05:07.755 element at address: 0x2000003b1500 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:05:07.755 element at address: 0x2000003adec0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:05:07.755 element at address: 0x2000003adc40 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 
0.000122 MiB name: rte_compressdev_data_11 00:05:07.755 element at address: 0x2000003ada80 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:05:07.755 element at address: 0x2000003aa440 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:05:07.755 element at address: 0x2000003aa1c0 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:05:07.755 element at address: 0x2000003aa000 with size: 0.000244 MiB 00:05:07.755 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:05:07.755 element at address: 0x2000003a69c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:05:07.756 element at address: 0x2000003a6740 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:05:07.756 element at address: 0x2000003a6580 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:05:07.756 element at address: 0x2000003a2f40 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:05:07.756 element at address: 0x2000003a2cc0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:05:07.756 element at address: 0x2000003a2b00 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:05:07.756 element at address: 0x20000039f4c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:05:07.756 element at address: 0x20000039f240 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:05:07.756 element at address: 0x20000039f080 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:05:07.756 element at address: 0x20000039ba40 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:05:07.756 element at address: 0x20000039b7c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:05:07.756 element at address: 0x20000039b600 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:05:07.756 element at address: 0x200000397fc0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:05:07.756 element at address: 0x200000397d40 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:05:07.756 element at address: 0x200000397b80 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:05:07.756 element at address: 0x200000394540 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:05:07.756 element at address: 0x2000003942c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:05:07.756 element at address: 0x200000394100 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:05:07.756 element at address: 
0x200000390ac0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38 00:05:07.756 element at address: 0x200000390840 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:05:07.756 element at address: 0x200000390680 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:05:07.756 element at address: 0x20000038d040 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:05:07.756 element at address: 0x20000038cdc0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:05:07.756 element at address: 0x20000038cc00 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:05:07.756 element at address: 0x2000003895c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:05:07.756 element at address: 0x200000389340 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:05:07.756 element at address: 0x200000389180 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:05:07.756 element at address: 0x200000385b40 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:05:07.756 element at address: 0x2000003858c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:05:07.756 element at address: 0x200000385700 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:05:07.756 element at address: 0x2000003820c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:05:07.756 element at address: 0x200000381e40 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:05:07.756 element at address: 0x200000381c80 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:05:07.756 element at address: 0x20000037e640 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:05:07.756 element at address: 0x20000037e3c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:05:07.756 element at address: 0x20000037e200 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:05:07.756 element at address: 0x20000037abc0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:05:07.756 element at address: 0x20000037a940 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:05:07.756 element at address: 0x20000037a780 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:05:07.756 element at address: 0x200000377140 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:05:07.756 element at address: 0x200000376ec0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 
0.000122 MiB name: rte_compressdev_data_26 00:05:07.756 element at address: 0x200000376d00 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:05:07.756 element at address: 0x2000003736c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:05:07.756 element at address: 0x200000373440 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:05:07.756 element at address: 0x200000373280 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 00:05:07.756 element at address: 0x20000036fc40 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:05:07.756 element at address: 0x20000036f9c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:05:07.756 element at address: 0x20000036f800 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:05:07.756 element at address: 0x20000036c1c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:05:07.756 element at address: 0x20000036bf40 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:05:07.756 element at address: 0x20000036bd80 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:05:07.756 element at address: 0x200000368740 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:05:07.756 element at address: 0x2000003684c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:05:07.756 element at address: 0x200000368300 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:05:07.756 element at address: 0x200000364cc0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:05:07.756 element at address: 0x200000364a40 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31 00:05:07.756 element at address: 0x200000364880 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:05:07.756 element at address: 0x200000361240 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_64 00:05:07.756 element at address: 0x200000360fc0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_32 00:05:07.756 element at address: 0x200000360e00 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_65 00:05:07.756 element at address: 0x20000035d7c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_66 00:05:07.756 element at address: 0x20000035d540 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_33 00:05:07.756 element at address: 0x20000035d380 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_67 00:05:07.756 element at address: 
0x200000359d40 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_68 00:05:07.756 element at address: 0x200000359ac0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_34 00:05:07.756 element at address: 0x200000359900 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_69 00:05:07.756 element at address: 0x2000003562c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_70 00:05:07.756 element at address: 0x200000356040 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_35 00:05:07.756 element at address: 0x200000355e80 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_71 00:05:07.756 element at address: 0x200000352840 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_72 00:05:07.756 element at address: 0x2000003525c0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_36 00:05:07.756 element at address: 0x200000352400 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_73 00:05:07.756 element at address: 0x20000034edc0 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_74 00:05:07.756 element at address: 0x20000034eb40 with size: 0.000244 MiB 00:05:07.756 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_37 00:05:07.757 element at address: 0x20000034e980 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_75 00:05:07.757 element at address: 0x20000034b340 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_76 00:05:07.757 element at address: 0x20000034b0c0 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_38 00:05:07.757 element at address: 0x20000034af00 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_77 00:05:07.757 element at address: 0x2000003478c0 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_78 00:05:07.757 element at address: 0x200000347640 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_39 00:05:07.757 element at address: 0x200000347480 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_79 00:05:07.757 element at address: 0x200000343e40 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_80 00:05:07.757 element at address: 0x200000343bc0 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_40 00:05:07.757 element at address: 0x200000343a00 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_81 00:05:07.757 element at address: 0x2000003403c0 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_82 00:05:07.757 element at address: 0x200000340140 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 
0.000122 MiB name: rte_compressdev_data_41 00:05:07.757 element at address: 0x20000033ff80 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_83 00:05:07.757 element at address: 0x20000033c940 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_84 00:05:07.757 element at address: 0x20000033c6c0 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_42 00:05:07.757 element at address: 0x20000033c500 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_85 00:05:07.757 element at address: 0x200000338ec0 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_86 00:05:07.757 element at address: 0x200000338c40 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_43 00:05:07.757 element at address: 0x200000338a80 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_87 00:05:07.757 element at address: 0x200000335440 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_88 00:05:07.757 element at address: 0x2000003351c0 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_44 00:05:07.757 element at address: 0x200000335000 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_89 00:05:07.757 element at address: 0x2000003319c0 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_90 00:05:07.757 element at address: 0x200000331740 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_45 00:05:07.757 element at address: 0x200000331580 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_91 00:05:07.757 element at address: 0x20000032df40 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_92 00:05:07.757 element at address: 0x20000032dcc0 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_46 00:05:07.757 element at address: 0x20000032db00 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_93 00:05:07.757 element at address: 0x20000032a4c0 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_94 00:05:07.757 element at address: 0x20000032a240 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_47 00:05:07.757 element at address: 0x20000032a080 with size: 0.000244 MiB 00:05:07.757 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_95 00:05:07.757 element at address: 0x2000003d6040 with size: 0.000183 MiB 00:05:07.757 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:05:07.757 04:05:55 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:07.757 04:05:55 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 3795858 00:05:07.757 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@946 -- # '[' -z 3795858 ']' 00:05:07.757 04:05:55 
dpdk_mem_utility -- common/autotest_common.sh@950 -- # kill -0 3795858 00:05:07.757 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@951 -- # uname 00:05:07.757 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:07.757 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3795858 00:05:07.757 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:07.757 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:07.757 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3795858' 00:05:07.757 killing process with pid 3795858 00:05:07.757 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@965 -- # kill 3795858 00:05:07.757 04:05:55 dpdk_mem_utility -- common/autotest_common.sh@970 -- # wait 3795858 00:05:08.324 00:05:08.324 real 0m1.709s 00:05:08.324 user 0m1.901s 00:05:08.324 sys 0m0.480s 00:05:08.324 04:05:56 dpdk_mem_utility -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:08.324 04:05:56 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:05:08.324 ************************************ 00:05:08.324 END TEST dpdk_mem_utility 00:05:08.324 ************************************ 00:05:08.324 04:05:56 -- spdk/autotest.sh@177 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:05:08.324 04:05:56 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:08.324 04:05:56 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:08.324 04:05:56 -- common/autotest_common.sh@10 -- # set +x 00:05:08.324 ************************************ 00:05:08.324 START TEST event 00:05:08.324 ************************************ 00:05:08.324 04:05:56 event -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:05:08.324 * Looking for test storage... 00:05:08.324 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:05:08.324 04:05:56 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:08.324 04:05:56 event -- bdev/nbd_common.sh@6 -- # set -e 00:05:08.324 04:05:56 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:08.324 04:05:56 event -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:05:08.324 04:05:56 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:08.324 04:05:56 event -- common/autotest_common.sh@10 -- # set +x 00:05:08.324 ************************************ 00:05:08.324 START TEST event_perf 00:05:08.324 ************************************ 00:05:08.324 04:05:56 event.event_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:08.324 Running I/O for 1 seconds...[2024-05-15 04:05:56.291414] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:05:08.324 [2024-05-15 04:05:56.291476] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3796179 ] 00:05:08.583 [2024-05-15 04:05:56.373713] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:08.583 [2024-05-15 04:05:56.492520] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:08.583 [2024-05-15 04:05:56.492574] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:08.583 [2024-05-15 04:05:56.492689] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:08.583 [2024-05-15 04:05:56.492693] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:09.962 Running I/O for 1 seconds... 00:05:09.962 lcore 0: 230480 00:05:09.962 lcore 1: 230479 00:05:09.962 lcore 2: 230479 00:05:09.962 lcore 3: 230481 00:05:09.962 done. 00:05:09.962 00:05:09.962 real 0m1.345s 00:05:09.962 user 0m4.224s 00:05:09.962 sys 0m0.115s 00:05:09.962 04:05:57 event.event_perf -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:09.962 04:05:57 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:05:09.962 ************************************ 00:05:09.962 END TEST event_perf 00:05:09.962 ************************************ 00:05:09.962 04:05:57 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:09.962 04:05:57 event -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:05:09.962 04:05:57 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:09.962 04:05:57 event -- common/autotest_common.sh@10 -- # set +x 00:05:09.962 ************************************ 00:05:09.962 START TEST event_reactor 00:05:09.962 ************************************ 00:05:09.962 04:05:57 event.event_reactor -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:09.962 [2024-05-15 04:05:57.689593] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:05:09.962 [2024-05-15 04:05:57.689653] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3796342 ] 00:05:09.962 [2024-05-15 04:05:57.771855] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:09.962 [2024-05-15 04:05:57.890466] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:11.338 test_start 00:05:11.338 oneshot 00:05:11.338 tick 100 00:05:11.338 tick 100 00:05:11.338 tick 250 00:05:11.338 tick 100 00:05:11.338 tick 100 00:05:11.338 tick 100 00:05:11.338 tick 250 00:05:11.338 tick 500 00:05:11.338 tick 100 00:05:11.338 tick 100 00:05:11.338 tick 250 00:05:11.338 tick 100 00:05:11.338 tick 100 00:05:11.338 test_end 00:05:11.338 00:05:11.338 real 0m1.345s 00:05:11.338 user 0m1.237s 00:05:11.338 sys 0m0.102s 00:05:11.338 04:05:59 event.event_reactor -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:11.338 04:05:59 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:05:11.338 ************************************ 00:05:11.338 END TEST event_reactor 00:05:11.338 ************************************ 00:05:11.338 04:05:59 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:11.338 04:05:59 event -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:05:11.338 04:05:59 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:11.338 04:05:59 event -- common/autotest_common.sh@10 -- # set +x 00:05:11.338 ************************************ 00:05:11.338 START TEST event_reactor_perf 00:05:11.338 ************************************ 00:05:11.338 04:05:59 event.event_reactor_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:11.338 [2024-05-15 04:05:59.086252] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:05:11.338 [2024-05-15 04:05:59.086313] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3796494 ] 00:05:11.338 [2024-05-15 04:05:59.168367] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:11.338 [2024-05-15 04:05:59.284503] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:12.714 test_start 00:05:12.714 test_end 00:05:12.714 Performance: 353505 events per second 00:05:12.714 00:05:12.714 real 0m1.337s 00:05:12.714 user 0m1.236s 00:05:12.714 sys 0m0.094s 00:05:12.714 04:06:00 event.event_reactor_perf -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:12.714 04:06:00 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:05:12.714 ************************************ 00:05:12.714 END TEST event_reactor_perf 00:05:12.714 ************************************ 00:05:12.714 04:06:00 event -- event/event.sh@49 -- # uname -s 00:05:12.714 04:06:00 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:12.714 04:06:00 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:12.714 04:06:00 event -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:12.714 04:06:00 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:12.714 04:06:00 event -- common/autotest_common.sh@10 -- # set +x 00:05:12.714 ************************************ 00:05:12.714 START TEST event_scheduler 00:05:12.714 ************************************ 00:05:12.714 04:06:00 event.event_scheduler -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:12.714 * Looking for test storage... 00:05:12.714 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:05:12.714 04:06:00 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:12.714 04:06:00 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=3796801 00:05:12.714 04:06:00 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:12.714 04:06:00 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:12.714 04:06:00 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 3796801 00:05:12.714 04:06:00 event.event_scheduler -- common/autotest_common.sh@827 -- # '[' -z 3796801 ']' 00:05:12.714 04:06:00 event.event_scheduler -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:12.714 04:06:00 event.event_scheduler -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:12.714 04:06:00 event.event_scheduler -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:12.714 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:12.714 04:06:00 event.event_scheduler -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:12.714 04:06:00 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:12.714 [2024-05-15 04:06:00.552322] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:05:12.715 [2024-05-15 04:06:00.552415] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3796801 ] 00:05:12.715 [2024-05-15 04:06:00.664054] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:12.972 [2024-05-15 04:06:00.810633] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:12.972 [2024-05-15 04:06:00.810697] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:12.972 [2024-05-15 04:06:00.810761] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:12.972 [2024-05-15 04:06:00.810769] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:12.972 04:06:00 event.event_scheduler -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:12.972 04:06:00 event.event_scheduler -- common/autotest_common.sh@860 -- # return 0 00:05:12.972 04:06:00 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:12.972 04:06:00 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:12.972 04:06:00 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:12.972 POWER: Env isn't set yet! 00:05:12.972 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:12.972 POWER: failed to open /sys/devices/system/cpu/cpu%u/cpufreq/scaling_available_frequencies 00:05:12.972 POWER: Cannot get available frequencies of lcore 0 00:05:12.972 POWER: Attempting to initialise PSTAT power management... 00:05:12.972 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:12.972 POWER: Initialized successfully for lcore 0 power management 00:05:12.972 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:12.972 POWER: Initialized successfully for lcore 1 power management 00:05:12.972 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:12.972 POWER: Initialized successfully for lcore 2 power management 00:05:12.972 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:12.972 POWER: Initialized successfully for lcore 3 power management 00:05:12.972 04:06:00 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:12.972 04:06:00 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:12.972 04:06:00 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:12.972 04:06:00 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:13.229 [2024-05-15 04:06:01.021011] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
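A minimal sketch of the startup handshake the scheduler test performs at this point, using only the calls visible in the log above; the app was launched with -m 0xF -p 0x2 --wait-for-rpc -f, and rpc_cmd is the harness's RPC wrapper (its definition is assumed to live in autotest_common.sh and is not shown in this log):

  # illustrative only: the two RPCs that complete scheduler app startup
  rpc_cmd framework_set_scheduler dynamic   # choose the dynamic scheduler while init is still paused
  rpc_cmd framework_start_init              # finish the subsystem init deferred by --wait-for-rpc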
00:05:13.229 04:06:01 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.229 04:06:01 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:13.229 04:06:01 event.event_scheduler -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:13.229 04:06:01 event.event_scheduler -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:13.229 04:06:01 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:13.229 ************************************ 00:05:13.229 START TEST scheduler_create_thread 00:05:13.229 ************************************ 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1121 -- # scheduler_create_thread 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.229 2 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.229 3 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.229 4 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.229 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.229 5 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.230 6 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.230 7 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.230 8 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.230 9 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.230 10 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:13.230 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.797 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:13.797 00:05:13.797 real 0m0.589s 00:05:13.797 user 0m0.012s 00:05:13.797 sys 0m0.003s 00:05:13.797 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:13.797 04:06:01 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:13.797 ************************************ 00:05:13.797 END TEST scheduler_create_thread 00:05:13.797 ************************************ 00:05:13.797 04:06:01 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:13.798 04:06:01 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 3796801 00:05:13.798 04:06:01 event.event_scheduler -- common/autotest_common.sh@946 -- # '[' -z 3796801 ']' 00:05:13.798 04:06:01 event.event_scheduler -- common/autotest_common.sh@950 -- # kill -0 3796801 00:05:13.798 04:06:01 event.event_scheduler -- common/autotest_common.sh@951 -- # uname 00:05:13.798 04:06:01 event.event_scheduler -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:13.798 04:06:01 event.event_scheduler -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3796801 00:05:13.798 04:06:01 event.event_scheduler -- common/autotest_common.sh@952 -- # process_name=reactor_2 00:05:13.798 04:06:01 event.event_scheduler -- common/autotest_common.sh@956 -- # '[' reactor_2 = sudo ']' 00:05:13.798 04:06:01 event.event_scheduler -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3796801' 00:05:13.798 killing process with pid 3796801 00:05:13.798 04:06:01 event.event_scheduler -- common/autotest_common.sh@965 -- # kill 3796801 00:05:13.798 04:06:01 event.event_scheduler -- common/autotest_common.sh@970 -- # wait 3796801 00:05:14.366 [2024-05-15 04:06:02.117618] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
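The scheduler_create_thread subtest above reduces to a short plugin-RPC lifecycle; the sketch below repeats only invocations and IDs recorded in this run (thread IDs 11 and 12 are specific to this log, and rpc_cmd is again the harness's RPC wrapper):

  # illustrative only: thread lifecycle as exercised by scheduler_create_thread
  rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100   # pinned thread, as logged at scheduler.sh@12
  rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0              # returned thread_id=11 in this run
  rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50                        # raise that thread's activity to 50
  rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100                # returned thread_id=12 in this run
  rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12                               # remove it again before teardown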
00:05:14.366 POWER: Power management governor of lcore 0 has been set to 'userspace' successfully 00:05:14.366 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:14.366 POWER: Power management governor of lcore 1 has been set to 'schedutil' successfully 00:05:14.366 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:14.366 POWER: Power management governor of lcore 2 has been set to 'schedutil' successfully 00:05:14.366 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:14.366 POWER: Power management governor of lcore 3 has been set to 'schedutil' successfully 00:05:14.366 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:14.624 00:05:14.624 real 0m1.926s 00:05:14.624 user 0m2.534s 00:05:14.625 sys 0m0.385s 00:05:14.625 04:06:02 event.event_scheduler -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:14.625 04:06:02 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:14.625 ************************************ 00:05:14.625 END TEST event_scheduler 00:05:14.625 ************************************ 00:05:14.625 04:06:02 event -- event/event.sh@51 -- # modprobe -n nbd 00:05:14.625 04:06:02 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:14.625 04:06:02 event -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:14.625 04:06:02 event -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:14.625 04:06:02 event -- common/autotest_common.sh@10 -- # set +x 00:05:14.625 ************************************ 00:05:14.625 START TEST app_repeat 00:05:14.625 ************************************ 00:05:14.625 04:06:02 event.app_repeat -- common/autotest_common.sh@1121 -- # app_repeat_test 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@19 -- # repeat_pid=3797101 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 3797101' 00:05:14.625 Process app_repeat pid: 3797101 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:14.625 spdk_app_start Round 0 00:05:14.625 04:06:02 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3797101 /var/tmp/spdk-nbd.sock 00:05:14.625 04:06:02 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 3797101 ']' 00:05:14.625 04:06:02 event.app_repeat -- 
common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:14.625 04:06:02 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:14.625 04:06:02 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:14.625 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:14.625 04:06:02 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:14.625 04:06:02 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:05:14.625 [2024-05-15 04:06:02.475864] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:14.625 [2024-05-15 04:06:02.475942] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3797101 ] 00:05:14.625 [2024-05-15 04:06:02.558791] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:14.883 [2024-05-15 04:06:02.677372] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:14.883 [2024-05-15 04:06:02.677378] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:14.883 04:06:02 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:14.883 04:06:02 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:05:14.883 04:06:02 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:15.140 Malloc0 00:05:15.140 04:06:03 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:15.397 Malloc1 00:05:15.397 04:06:03 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:15.397 04:06:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:15.655 /dev/nbd0 00:05:15.655 04:06:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:15.655 04:06:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:15.655 1+0 records in 00:05:15.655 1+0 records out 00:05:15.655 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000147164 s, 27.8 MB/s 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:05:15.655 04:06:03 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:05:15.655 04:06:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:15.655 04:06:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:15.655 04:06:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:15.911 /dev/nbd1 00:05:15.911 04:06:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:15.911 04:06:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:15.911 1+0 records in 00:05:15.911 1+0 records out 00:05:15.911 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000188852 s, 21.7 MB/s 00:05:15.911 04:06:03 event.app_repeat -- 
common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:05:15.911 04:06:03 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:05:15.911 04:06:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:15.911 04:06:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:15.911 04:06:03 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:15.911 04:06:03 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:15.911 04:06:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:16.168 04:06:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:16.168 { 00:05:16.168 "nbd_device": "/dev/nbd0", 00:05:16.168 "bdev_name": "Malloc0" 00:05:16.168 }, 00:05:16.168 { 00:05:16.168 "nbd_device": "/dev/nbd1", 00:05:16.168 "bdev_name": "Malloc1" 00:05:16.169 } 00:05:16.169 ]' 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:16.169 { 00:05:16.169 "nbd_device": "/dev/nbd0", 00:05:16.169 "bdev_name": "Malloc0" 00:05:16.169 }, 00:05:16.169 { 00:05:16.169 "nbd_device": "/dev/nbd1", 00:05:16.169 "bdev_name": "Malloc1" 00:05:16.169 } 00:05:16.169 ]' 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:16.169 /dev/nbd1' 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:16.169 /dev/nbd1' 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:16.169 256+0 records in 00:05:16.169 256+0 records out 00:05:16.169 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0051086 s, 205 MB/s 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:16.169 04:06:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:16.427 256+0 records in 00:05:16.427 256+0 records out 00:05:16.427 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0208119 s, 50.4 MB/s 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:16.427 256+0 records in 00:05:16.427 256+0 records out 00:05:16.427 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0247856 s, 42.3 MB/s 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:16.427 04:06:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:16.686 04:06:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:16.686 04:06:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:16.686 04:06:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:16.686 04:06:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:16.687 04:06:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:16.687 04:06:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:16.687 04:06:04 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:16.687 04:06:04 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:16.687 
04:06:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:16.687 04:06:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:16.945 04:06:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:17.204 04:06:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:17.204 04:06:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:17.204 04:06:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:17.204 04:06:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:17.204 04:06:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:05:17.204 04:06:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:17.204 04:06:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:05:17.204 04:06:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:05:17.204 04:06:05 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:05:17.204 04:06:05 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:05:17.204 04:06:05 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:17.204 04:06:05 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:05:17.204 04:06:05 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:17.464 04:06:05 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:05:17.723 [2024-05-15 04:06:05.614325] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:17.723 [2024-05-15 04:06:05.729883] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:17.723 [2024-05-15 04:06:05.729888] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:17.982 [2024-05-15 04:06:05.792601] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:17.982 [2024-05-15 04:06:05.792679] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
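Each app_repeat round traced above is the same nbd round trip: two 64 MB malloc bdevs with 4096-byte blocks are created over the app's /var/tmp/spdk-nbd.sock socket, exported as /dev/nbd0 and /dev/nbd1, written with 1 MiB of random data, compared back against the source file, and detached. Condensed, and with rpc.py and randfile standing in for the full workspace paths used in the trace, the flow is:

rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096        # -> Malloc0 (64 MB, 4096-byte blocks)
rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096        # -> Malloc1
rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0  # expose each bdev as an nbd block device
rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
dd if=/dev/urandom of=randfile bs=4096 count=256                   # 1 MiB of reference data
dd if=randfile of=/dev/nbd0 bs=4096 count=256 oflag=direct         # push it through both nbd devices
dd if=randfile of=/dev/nbd1 bs=4096 count=256 oflag=direct
cmp -b -n 1M randfile /dev/nbd0                                    # verify the data reads back identically
cmp -b -n 1M randfile /dev/nbd1
rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0           # detach before the instance is killed
rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1

Every RPC method, dd invocation and cmp check above appears verbatim in the trace; the per-round throughput figures vary a little, which is expected for 1 MiB direct writes.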
00:05:20.516 04:06:08 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:05:20.516 04:06:08 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:20.516 spdk_app_start Round 1 00:05:20.516 04:06:08 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3797101 /var/tmp/spdk-nbd.sock 00:05:20.516 04:06:08 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 3797101 ']' 00:05:20.516 04:06:08 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:20.516 04:06:08 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:20.516 04:06:08 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:20.517 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:20.517 04:06:08 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:20.517 04:06:08 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:05:20.775 04:06:08 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:20.775 04:06:08 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:05:20.775 04:06:08 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:21.033 Malloc0 00:05:21.033 04:06:08 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:21.324 Malloc1 00:05:21.324 04:06:09 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:21.324 04:06:09 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:21.607 /dev/nbd0 00:05:21.607 04:06:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:21.607 04:06:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 
00:05:21.607 04:06:09 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:05:21.607 04:06:09 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:05:21.607 04:06:09 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:05:21.607 04:06:09 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:05:21.607 04:06:09 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:05:21.607 04:06:09 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:05:21.607 04:06:09 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:05:21.607 04:06:09 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:05:21.607 04:06:09 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:21.607 1+0 records in 00:05:21.607 1+0 records out 00:05:21.608 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000181435 s, 22.6 MB/s 00:05:21.608 04:06:09 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:21.608 04:06:09 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:05:21.608 04:06:09 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:21.608 04:06:09 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:05:21.608 04:06:09 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:05:21.608 04:06:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:21.608 04:06:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:21.608 04:06:09 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:21.869 /dev/nbd1 00:05:21.869 04:06:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:21.869 04:06:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:21.869 1+0 records in 00:05:21.869 1+0 records out 00:05:21.869 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000200194 s, 20.5 MB/s 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@883 
-- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:05:21.869 04:06:09 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:05:21.869 04:06:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:21.869 04:06:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:21.869 04:06:09 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:21.869 04:06:09 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:21.869 04:06:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:22.154 { 00:05:22.154 "nbd_device": "/dev/nbd0", 00:05:22.154 "bdev_name": "Malloc0" 00:05:22.154 }, 00:05:22.154 { 00:05:22.154 "nbd_device": "/dev/nbd1", 00:05:22.154 "bdev_name": "Malloc1" 00:05:22.154 } 00:05:22.154 ]' 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:22.154 { 00:05:22.154 "nbd_device": "/dev/nbd0", 00:05:22.154 "bdev_name": "Malloc0" 00:05:22.154 }, 00:05:22.154 { 00:05:22.154 "nbd_device": "/dev/nbd1", 00:05:22.154 "bdev_name": "Malloc1" 00:05:22.154 } 00:05:22.154 ]' 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:22.154 /dev/nbd1' 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:22.154 /dev/nbd1' 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:22.154 256+0 records in 00:05:22.154 256+0 records out 00:05:22.154 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00519025 s, 202 MB/s 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:22.154 04:06:09 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:22.154 256+0 records in 00:05:22.154 256+0 records out 00:05:22.154 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0227228 s, 46.1 MB/s 00:05:22.154 
04:06:09 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:22.155 04:06:09 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:22.155 256+0 records in 00:05:22.155 256+0 records out 00:05:22.155 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0227846 s, 46.0 MB/s 00:05:22.155 04:06:09 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:22.155 04:06:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:22.155 04:06:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:22.155 04:06:09 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:22.155 04:06:09 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:05:22.155 04:06:09 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:22.155 04:06:10 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:22.413 04:06:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:22.413 04:06:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:22.413 04:06:10 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:22.413 04:06:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:22.413 04:06:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:22.413 04:06:10 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:22.413 04:06:10 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:22.413 04:06:10 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:22.413 04:06:10 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:22.413 04:06:10 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 
00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:22.671 04:06:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:22.929 04:06:10 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:05:22.929 04:06:10 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:23.186 04:06:11 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:05:23.444 [2024-05-15 04:06:11.388553] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:23.702 [2024-05-15 04:06:11.504924] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:23.702 [2024-05-15 04:06:11.504930] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:23.702 [2024-05-15 04:06:11.568410] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:23.702 [2024-05-15 04:06:11.568491] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
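The round structure itself is the small loop traced at event/event.sh lines 23-35: announce the round, run the data-verify pass, ask the app instance to exit over its RPC socket, and pause before the next iteration. Reconstructed from those traced line numbers (a paraphrase, not the actual event.sh source, with rpc.py again standing in for the full script path):

for i in {0..2}; do
  echo "spdk_app_start Round $i"
  # ... malloc/nbd create, write, verify, detach — see the sketch after Round 0 ...
  rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
  sleep 3
done

The kill-and-sleep step is what produces the reinitialization notices between rounds and the 'Shutdown signal received' lines in the summary at the end of the test.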
00:05:26.238 04:06:14 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:05:26.238 04:06:14 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:26.238 spdk_app_start Round 2 00:05:26.238 04:06:14 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3797101 /var/tmp/spdk-nbd.sock 00:05:26.238 04:06:14 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 3797101 ']' 00:05:26.238 04:06:14 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:26.238 04:06:14 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:26.238 04:06:14 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:26.238 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:26.238 04:06:14 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:26.238 04:06:14 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:05:26.495 04:06:14 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:26.495 04:06:14 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:05:26.495 04:06:14 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:26.753 Malloc0 00:05:26.753 04:06:14 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:27.011 Malloc1 00:05:27.011 04:06:14 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:27.011 04:06:14 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:27.272 /dev/nbd0 00:05:27.272 04:06:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:27.272 04:06:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 
00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:27.272 1+0 records in 00:05:27.272 1+0 records out 00:05:27.272 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000160727 s, 25.5 MB/s 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:05:27.272 04:06:15 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:05:27.272 04:06:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:27.272 04:06:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:27.272 04:06:15 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:27.531 /dev/nbd1 00:05:27.531 04:06:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:27.531 04:06:15 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@865 -- # local i 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@869 -- # break 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:27.531 1+0 records in 00:05:27.531 1+0 records out 00:05:27.531 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000164186 s, 24.9 MB/s 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@882 -- # size=4096 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@883 
-- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:05:27.531 04:06:15 event.app_repeat -- common/autotest_common.sh@885 -- # return 0 00:05:27.531 04:06:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:27.531 04:06:15 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:27.531 04:06:15 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:27.531 04:06:15 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.531 04:06:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:27.790 { 00:05:27.790 "nbd_device": "/dev/nbd0", 00:05:27.790 "bdev_name": "Malloc0" 00:05:27.790 }, 00:05:27.790 { 00:05:27.790 "nbd_device": "/dev/nbd1", 00:05:27.790 "bdev_name": "Malloc1" 00:05:27.790 } 00:05:27.790 ]' 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:27.790 { 00:05:27.790 "nbd_device": "/dev/nbd0", 00:05:27.790 "bdev_name": "Malloc0" 00:05:27.790 }, 00:05:27.790 { 00:05:27.790 "nbd_device": "/dev/nbd1", 00:05:27.790 "bdev_name": "Malloc1" 00:05:27.790 } 00:05:27.790 ]' 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:27.790 /dev/nbd1' 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:27.790 /dev/nbd1' 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:27.790 04:06:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:27.791 256+0 records in 00:05:27.791 256+0 records out 00:05:27.791 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.004979 s, 211 MB/s 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:27.791 256+0 records in 00:05:27.791 256+0 records out 00:05:27.791 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0235778 s, 44.5 MB/s 00:05:27.791 04:06:15 
event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:27.791 256+0 records in 00:05:27.791 256+0 records out 00:05:27.791 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0253424 s, 41.4 MB/s 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:27.791 04:06:15 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:28.049 04:06:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:28.049 04:06:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:28.049 04:06:16 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:28.049 04:06:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:28.049 04:06:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:28.049 04:06:16 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:28.049 04:06:16 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:28.049 04:06:16 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:28.049 04:06:16 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:28.049 04:06:16 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:28.308 
04:06:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:28.308 04:06:16 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:28.308 04:06:16 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:28.308 04:06:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:28.308 04:06:16 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:28.308 04:06:16 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:28.308 04:06:16 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:28.308 04:06:16 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:28.308 04:06:16 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:28.308 04:06:16 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:28.308 04:06:16 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:28.566 04:06:16 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:05:28.566 04:06:16 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:28.825 04:06:16 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:05:29.394 [2024-05-15 04:06:17.126411] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:29.394 [2024-05-15 04:06:17.242487] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:29.394 [2024-05-15 04:06:17.242492] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:29.394 [2024-05-15 04:06:17.304898] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:29.394 [2024-05-15 04:06:17.304963] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:31.924 04:06:19 event.app_repeat -- event/event.sh@38 -- # waitforlisten 3797101 /var/tmp/spdk-nbd.sock 00:05:31.924 04:06:19 event.app_repeat -- common/autotest_common.sh@827 -- # '[' -z 3797101 ']' 00:05:31.924 04:06:19 event.app_repeat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:31.924 04:06:19 event.app_repeat -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:31.924 04:06:19 event.app_repeat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:05:31.924 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:31.924 04:06:19 event.app_repeat -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:31.924 04:06:19 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:05:32.181 04:06:20 event.app_repeat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@860 -- # return 0 00:05:32.182 04:06:20 event.app_repeat -- event/event.sh@39 -- # killprocess 3797101 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@946 -- # '[' -z 3797101 ']' 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@950 -- # kill -0 3797101 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@951 -- # uname 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3797101 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3797101' 00:05:32.182 killing process with pid 3797101 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@965 -- # kill 3797101 00:05:32.182 04:06:20 event.app_repeat -- common/autotest_common.sh@970 -- # wait 3797101 00:05:32.439 spdk_app_start is called in Round 0. 00:05:32.439 Shutdown signal received, stop current app iteration 00:05:32.439 Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 reinitialization... 00:05:32.439 spdk_app_start is called in Round 1. 00:05:32.439 Shutdown signal received, stop current app iteration 00:05:32.439 Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 reinitialization... 00:05:32.439 spdk_app_start is called in Round 2. 00:05:32.439 Shutdown signal received, stop current app iteration 00:05:32.439 Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 reinitialization... 00:05:32.439 spdk_app_start is called in Round 3. 
00:05:32.439 Shutdown signal received, stop current app iteration 00:05:32.439 04:06:20 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:32.439 04:06:20 event.app_repeat -- event/event.sh@42 -- # return 0 00:05:32.439 00:05:32.439 real 0m17.925s 00:05:32.439 user 0m39.019s 00:05:32.439 sys 0m3.414s 00:05:32.439 04:06:20 event.app_repeat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:32.439 04:06:20 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:05:32.439 ************************************ 00:05:32.440 END TEST app_repeat 00:05:32.440 ************************************ 00:05:32.440 04:06:20 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:32.440 00:05:32.440 real 0m24.210s 00:05:32.440 user 0m48.386s 00:05:32.440 sys 0m4.318s 00:05:32.440 04:06:20 event -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:32.440 04:06:20 event -- common/autotest_common.sh@10 -- # set +x 00:05:32.440 ************************************ 00:05:32.440 END TEST event 00:05:32.440 ************************************ 00:05:32.440 04:06:20 -- spdk/autotest.sh@178 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:05:32.440 04:06:20 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:32.440 04:06:20 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:32.440 04:06:20 -- common/autotest_common.sh@10 -- # set +x 00:05:32.440 ************************************ 00:05:32.440 START TEST thread 00:05:32.440 ************************************ 00:05:32.440 04:06:20 thread -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:05:32.698 * Looking for test storage... 00:05:32.698 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:05:32.698 04:06:20 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:32.698 04:06:20 thread -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:05:32.698 04:06:20 thread -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:32.698 04:06:20 thread -- common/autotest_common.sh@10 -- # set +x 00:05:32.698 ************************************ 00:05:32.698 START TEST thread_poller_perf 00:05:32.698 ************************************ 00:05:32.698 04:06:20 thread.thread_poller_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:32.698 [2024-05-15 04:06:20.547289] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:32.698 [2024-05-15 04:06:20.547351] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3799976 ] 00:05:32.699 [2024-05-15 04:06:20.625994] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:32.957 [2024-05-15 04:06:20.741629] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:32.957 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:05:33.890 ====================================== 00:05:33.890 busy:2711224763 (cyc) 00:05:33.890 total_run_count: 292000 00:05:33.890 tsc_hz: 2700000000 (cyc) 00:05:33.890 ====================================== 00:05:33.890 poller_cost: 9285 (cyc), 3438 (nsec) 00:05:33.890 00:05:33.890 real 0m1.347s 00:05:33.890 user 0m1.235s 00:05:33.890 sys 0m0.107s 00:05:33.890 04:06:21 thread.thread_poller_perf -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:33.890 04:06:21 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:05:33.890 ************************************ 00:05:33.890 END TEST thread_poller_perf 00:05:33.890 ************************************ 00:05:33.890 04:06:21 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:33.890 04:06:21 thread -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:05:33.890 04:06:21 thread -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:33.890 04:06:21 thread -- common/autotest_common.sh@10 -- # set +x 00:05:34.148 ************************************ 00:05:34.148 START TEST thread_poller_perf 00:05:34.148 ************************************ 00:05:34.148 04:06:21 thread.thread_poller_perf -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:34.148 [2024-05-15 04:06:21.947121] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:34.148 [2024-05-15 04:06:21.947188] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3800208 ] 00:05:34.148 [2024-05-15 04:06:22.031916] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:34.148 [2024-05-15 04:06:22.153083] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.148 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:05:35.521 ====================================== 00:05:35.521 busy:2702677835 (cyc) 00:05:35.521 total_run_count: 3941000 00:05:35.521 tsc_hz: 2700000000 (cyc) 00:05:35.521 ====================================== 00:05:35.521 poller_cost: 685 (cyc), 253 (nsec) 00:05:35.521 00:05:35.521 real 0m1.349s 00:05:35.521 user 0m1.249s 00:05:35.521 sys 0m0.094s 00:05:35.521 04:06:23 thread.thread_poller_perf -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:35.521 04:06:23 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:05:35.521 ************************************ 00:05:35.521 END TEST thread_poller_perf 00:05:35.521 ************************************ 00:05:35.521 04:06:23 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:05:35.521 00:05:35.521 real 0m2.850s 00:05:35.521 user 0m2.546s 00:05:35.521 sys 0m0.300s 00:05:35.521 04:06:23 thread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:35.521 04:06:23 thread -- common/autotest_common.sh@10 -- # set +x 00:05:35.521 ************************************ 00:05:35.521 END TEST thread 00:05:35.521 ************************************ 00:05:35.521 04:06:23 -- spdk/autotest.sh@179 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:05:35.521 04:06:23 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:35.521 04:06:23 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:35.521 04:06:23 -- common/autotest_common.sh@10 -- # set +x 00:05:35.521 ************************************ 00:05:35.521 START TEST accel 00:05:35.521 ************************************ 00:05:35.521 04:06:23 accel -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:05:35.521 * Looking for test storage... 00:05:35.521 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:05:35.521 04:06:23 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:05:35.521 04:06:23 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:05:35.521 04:06:23 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:35.521 04:06:23 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=3800447 00:05:35.521 04:06:23 accel -- accel/accel.sh@63 -- # waitforlisten 3800447 00:05:35.521 04:06:23 accel -- common/autotest_common.sh@827 -- # '[' -z 3800447 ']' 00:05:35.521 04:06:23 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:05:35.521 04:06:23 accel -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.521 04:06:23 accel -- accel/accel.sh@61 -- # build_accel_config 00:05:35.521 04:06:23 accel -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:35.521 04:06:23 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:35.521 04:06:23 accel -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.521 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
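As a cross-check on the two poller_perf summaries above (this is plain arithmetic on the reported counters, not extra tool output): poller_cost is busy cycles divided by total_run_count, converted to nanoseconds via tsc_hz. That gives 2711224763 / 292000 ≈ 9285 cycles ≈ 3438 ns per poll for the 1 µs-period run, and 2702677835 / 3941000 ≈ 685 cycles ≈ 253 ns per poll for the 0 µs-period run, at the reported 2.7 GHz TSC.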
00:05:35.521 04:06:23 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:35.521 04:06:23 accel -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:35.521 04:06:23 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:35.521 04:06:23 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:35.521 04:06:23 accel -- common/autotest_common.sh@10 -- # set +x 00:05:35.521 04:06:23 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:35.521 04:06:23 accel -- accel/accel.sh@40 -- # local IFS=, 00:05:35.521 04:06:23 accel -- accel/accel.sh@41 -- # jq -r . 00:05:35.521 [2024-05-15 04:06:23.453084] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:35.521 [2024-05-15 04:06:23.453164] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3800447 ] 00:05:35.521 [2024-05-15 04:06:23.526916] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.779 [2024-05-15 04:06:23.636368] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@860 -- # return 0 00:05:36.712 04:06:24 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:05:36.712 04:06:24 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:05:36.712 04:06:24 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:05:36.712 04:06:24 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:05:36.712 04:06:24 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:05:36.712 04:06:24 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:36.712 04:06:24 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@10 -- # set +x 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 
04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # IFS== 00:05:36.712 04:06:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:05:36.712 04:06:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:05:36.712 04:06:24 accel -- accel/accel.sh@75 -- # killprocess 3800447 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@946 -- # '[' -z 3800447 ']' 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@950 -- # kill -0 3800447 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@951 -- # uname 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3800447 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3800447' 00:05:36.712 killing process with pid 3800447 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@965 -- # kill 3800447 00:05:36.712 04:06:24 accel -- common/autotest_common.sh@970 -- # wait 3800447 00:05:36.971 04:06:24 accel -- accel/accel.sh@76 -- # trap - ERR 00:05:36.971 04:06:24 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:05:36.971 04:06:24 accel -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:05:36.971 04:06:24 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:36.971 04:06:24 accel -- common/autotest_common.sh@10 -- # set +x 00:05:36.971 04:06:24 accel.accel_help -- common/autotest_common.sh@1121 -- # accel_perf -h 00:05:36.971 04:06:24 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:05:36.971 04:06:24 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:05:36.971 04:06:24 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:36.971 04:06:24 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:36.971 04:06:24 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:36.971 04:06:24 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:36.971 04:06:24 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:36.971 04:06:24 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:05:36.971 04:06:24 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
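The long run of for/IFS==/read lines above is accel.sh populating its expected_opcs table from the accel_get_opc_assignments RPC; roughly, with an illustrative (assumed, not captured in this log) JSON object standing in for the real RPC reply:

    echo '{"copy": "software", "fill": "software", "crc32c": "software"}' \
      | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' \
      | while IFS== read -r opc module; do
            echo "expected_opcs[$opc]=$module"      # accel.sh keeps these in an associative array
        done
    # prints: expected_opcs[copy]=software, expected_opcs[fill]=software, expected_opcs[crc32c]=software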
00:05:36.971 04:06:24 accel.accel_help -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:36.971 04:06:24 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:05:37.229 04:06:24 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:05:37.229 04:06:24 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:05:37.229 04:06:24 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:37.229 04:06:24 accel -- common/autotest_common.sh@10 -- # set +x 00:05:37.229 ************************************ 00:05:37.229 START TEST accel_missing_filename 00:05:37.229 ************************************ 00:05:37.229 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w compress 00:05:37.229 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:05:37.229 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:05:37.229 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:05:37.229 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:37.229 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:05:37.229 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:37.229 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:05:37.229 04:06:25 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:05:37.229 04:06:25 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:05:37.229 04:06:25 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:37.229 04:06:25 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:37.229 04:06:25 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:37.229 04:06:25 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:37.229 04:06:25 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:37.229 04:06:25 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:05:37.229 04:06:25 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:05:37.229 [2024-05-15 04:06:25.048187] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:37.229 [2024-05-15 04:06:25.048248] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3800631 ] 00:05:37.229 [2024-05-15 04:06:25.131141] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:37.487 [2024-05-15 04:06:25.249288] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:37.487 [2024-05-15 04:06:25.323801] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:37.487 [2024-05-15 04:06:25.411595] accel_perf.c:1393:main: *ERROR*: ERROR starting application 00:05:37.746 A filename is required. 
00:05:37.746 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:05:37.746 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:37.746 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:05:37.746 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:05:37.746 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:05:37.746 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:37.746 00:05:37.746 real 0m0.515s 00:05:37.746 user 0m0.378s 00:05:37.746 sys 0m0.155s 00:05:37.746 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:37.746 04:06:25 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:05:37.746 ************************************ 00:05:37.746 END TEST accel_missing_filename 00:05:37.746 ************************************ 00:05:37.746 04:06:25 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:05:37.746 04:06:25 accel -- common/autotest_common.sh@1097 -- # '[' 10 -le 1 ']' 00:05:37.746 04:06:25 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:37.746 04:06:25 accel -- common/autotest_common.sh@10 -- # set +x 00:05:37.746 ************************************ 00:05:37.746 START TEST accel_compress_verify 00:05:37.746 ************************************ 00:05:37.746 04:06:25 accel.accel_compress_verify -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:05:37.746 04:06:25 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:05:37.746 04:06:25 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:05:37.746 04:06:25 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:05:37.746 04:06:25 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:37.746 04:06:25 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:05:37.746 04:06:25 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:37.746 04:06:25 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:05:37.746 04:06:25 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:05:37.746 04:06:25 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:05:37.746 04:06:25 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:37.746 04:06:25 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:37.746 04:06:25 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:37.746 04:06:25 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:37.746 04:06:25 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:37.746 04:06:25 
accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:05:37.746 04:06:25 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:05:37.746 [2024-05-15 04:06:25.615721] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:37.746 [2024-05-15 04:06:25.615779] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3800771 ] 00:05:37.746 [2024-05-15 04:06:25.696457] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.004 [2024-05-15 04:06:25.814447] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.004 [2024-05-15 04:06:25.883268] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:38.004 [2024-05-15 04:06:25.965774] accel_perf.c:1393:main: *ERROR*: ERROR starting application 00:05:38.262 00:05:38.262 Compression does not support the verify option, aborting. 00:05:38.262 04:06:26 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:05:38.262 04:06:26 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:38.262 04:06:26 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:05:38.262 04:06:26 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:05:38.262 04:06:26 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:05:38.262 04:06:26 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:38.262 00:05:38.262 real 0m0.500s 00:05:38.262 user 0m0.361s 00:05:38.262 sys 0m0.163s 00:05:38.262 04:06:26 accel.accel_compress_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:38.262 04:06:26 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:05:38.263 ************************************ 00:05:38.263 END TEST accel_compress_verify 00:05:38.263 ************************************ 00:05:38.263 04:06:26 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:05:38.263 04:06:26 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:05:38.263 04:06:26 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:38.263 04:06:26 accel -- common/autotest_common.sh@10 -- # set +x 00:05:38.263 ************************************ 00:05:38.263 START TEST accel_wrong_workload 00:05:38.263 ************************************ 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w foobar 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:05:38.263 04:06:26 accel.accel_wrong_workload -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:05:38.263 04:06:26 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:05:38.263 04:06:26 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:38.263 04:06:26 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:38.263 04:06:26 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:38.263 04:06:26 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:38.263 04:06:26 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:38.263 04:06:26 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:05:38.263 04:06:26 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:05:38.263 Unsupported workload type: foobar 00:05:38.263 [2024-05-15 04:06:26.162153] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:05:38.263 accel_perf options: 00:05:38.263 [-h help message] 00:05:38.263 [-q queue depth per core] 00:05:38.263 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:05:38.263 [-T number of threads per core 00:05:38.263 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:05:38.263 [-t time in seconds] 00:05:38.263 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:05:38.263 [ dif_verify, , dif_generate, dif_generate_copy 00:05:38.263 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:05:38.263 [-l for compress/decompress workloads, name of uncompressed input file 00:05:38.263 [-S for crc32c workload, use this seed value (default 0) 00:05:38.263 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:05:38.263 [-f for fill workload, use this BYTE value (default 255) 00:05:38.263 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:05:38.263 [-y verify result if this switch is on] 00:05:38.263 [-a tasks to allocate per core (default: same value as -q)] 00:05:38.263 Can be used to spread operations across a wider range of memory. 
00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:38.263 00:05:38.263 real 0m0.028s 00:05:38.263 user 0m0.019s 00:05:38.263 sys 0m0.009s 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:38.263 04:06:26 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:05:38.263 ************************************ 00:05:38.263 END TEST accel_wrong_workload 00:05:38.263 ************************************ 00:05:38.263 Error: writing output failed: Broken pipe 00:05:38.263 04:06:26 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:05:38.263 04:06:26 accel -- common/autotest_common.sh@1097 -- # '[' 10 -le 1 ']' 00:05:38.263 04:06:26 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:38.263 04:06:26 accel -- common/autotest_common.sh@10 -- # set +x 00:05:38.263 ************************************ 00:05:38.263 START TEST accel_negative_buffers 00:05:38.263 ************************************ 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:05:38.263 04:06:26 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:05:38.263 04:06:26 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:05:38.263 04:06:26 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:38.263 04:06:26 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:38.263 04:06:26 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:38.263 04:06:26 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:38.263 04:06:26 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:38.263 04:06:26 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:05:38.263 04:06:26 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:05:38.263 -x option must be non-negative. 
00:05:38.263 [2024-05-15 04:06:26.241268] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:05:38.263 accel_perf options: 00:05:38.263 [-h help message] 00:05:38.263 [-q queue depth per core] 00:05:38.263 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:05:38.263 [-T number of threads per core 00:05:38.263 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:05:38.263 [-t time in seconds] 00:05:38.263 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:05:38.263 [ dif_verify, , dif_generate, dif_generate_copy 00:05:38.263 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:05:38.263 [-l for compress/decompress workloads, name of uncompressed input file 00:05:38.263 [-S for crc32c workload, use this seed value (default 0) 00:05:38.263 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:05:38.263 [-f for fill workload, use this BYTE value (default 255) 00:05:38.263 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:05:38.263 [-y verify result if this switch is on] 00:05:38.263 [-a tasks to allocate per core (default: same value as -q)] 00:05:38.263 Can be used to spread operations across a wider range of memory. 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:05:38.263 00:05:38.263 real 0m0.028s 00:05:38.263 user 0m0.020s 00:05:38.263 sys 0m0.008s 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:38.263 04:06:26 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:05:38.263 ************************************ 00:05:38.263 END TEST accel_negative_buffers 00:05:38.263 ************************************ 00:05:38.263 Error: writing output failed: Broken pipe 00:05:38.263 04:06:26 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:05:38.263 04:06:26 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:05:38.263 04:06:26 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:38.263 04:06:26 accel -- common/autotest_common.sh@10 -- # set +x 00:05:38.521 ************************************ 00:05:38.521 START TEST accel_crc32c 00:05:38.521 ************************************ 00:05:38.521 04:06:26 accel.accel_crc32c -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w crc32c -S 32 -y 00:05:38.521 04:06:26 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:05:38.521 04:06:26 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:05:38.521 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.521 04:06:26 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:05:38.521 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.521 04:06:26 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 
-y 00:05:38.521 04:06:26 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:05:38.521 04:06:26 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:38.522 04:06:26 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:38.522 04:06:26 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:38.522 04:06:26 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:38.522 04:06:26 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:38.522 04:06:26 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:05:38.522 04:06:26 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:05:38.522 [2024-05-15 04:06:26.316160] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:38.522 [2024-05-15 04:06:26.316227] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3800845 ] 00:05:38.522 [2024-05-15 04:06:26.398515] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.522 [2024-05-15 04:06:26.515858] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 
accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:05:38.780 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:38.781 04:06:26 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:40.153 04:06:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:40.153 04:06:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:40.153 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:40.153 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:40.153 04:06:27 
accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:05:40.154 04:06:27 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:40.154 00:05:40.154 real 0m1.507s 00:05:40.154 user 0m0.009s 00:05:40.154 sys 0m0.001s 00:05:40.154 04:06:27 accel.accel_crc32c -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:40.154 04:06:27 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:05:40.154 ************************************ 00:05:40.154 END TEST accel_crc32c 00:05:40.154 ************************************ 00:05:40.154 04:06:27 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:05:40.154 04:06:27 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:05:40.154 04:06:27 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:40.154 04:06:27 accel -- common/autotest_common.sh@10 -- # set +x 00:05:40.154 ************************************ 00:05:40.154 START TEST accel_crc32c_C2 00:05:40.154 ************************************ 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w crc32c -y -C 2 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:05:40.154 04:06:27 
accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:05:40.154 04:06:27 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:05:40.154 [2024-05-15 04:06:27.873032] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:40.154 [2024-05-15 04:06:27.873091] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3801113 ] 00:05:40.154 [2024-05-15 04:06:27.951629] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:40.154 [2024-05-15 04:06:28.065594] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 
accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:40.154 04:06:28 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:41.526 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:41.526 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:41.526 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 
-- # IFS=: 00:05:41.526 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:41.526 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:41.527 00:05:41.527 real 0m1.498s 00:05:41.527 user 0m0.009s 00:05:41.527 sys 0m0.002s 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:41.527 04:06:29 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:05:41.527 ************************************ 00:05:41.527 END TEST accel_crc32c_C2 00:05:41.527 ************************************ 00:05:41.527 04:06:29 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:05:41.527 04:06:29 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:05:41.527 04:06:29 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:41.527 04:06:29 accel -- common/autotest_common.sh@10 -- # set +x 00:05:41.527 ************************************ 00:05:41.527 START TEST accel_copy 00:05:41.527 ************************************ 00:05:41.527 04:06:29 accel.accel_copy -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy -y 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
copy -y 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:05:41.527 04:06:29 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:05:41.527 [2024-05-15 04:06:29.421733] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:41.527 [2024-05-15 04:06:29.421792] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3801280 ] 00:05:41.527 [2024-05-15 04:06:29.504059] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:41.785 [2024-05-15 04:06:29.624176] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:05:41.785 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:41.786 04:06:29 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:41.786 04:06:29 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 
-- # IFS=: 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:05:43.158 04:06:30 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:43.158 00:05:43.158 real 0m1.515s 00:05:43.158 user 0m0.007s 00:05:43.158 sys 0m0.003s 00:05:43.158 04:06:30 accel.accel_copy -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:43.158 04:06:30 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:05:43.158 ************************************ 00:05:43.158 END TEST accel_copy 00:05:43.158 ************************************ 00:05:43.158 04:06:30 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:43.158 04:06:30 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:05:43.158 04:06:30 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:43.158 04:06:30 accel -- common/autotest_common.sh@10 -- # set +x 00:05:43.158 ************************************ 00:05:43.158 START TEST accel_fill 00:05:43.158 ************************************ 00:05:43.158 04:06:30 accel.accel_fill -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:05:43.158 04:06:30 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 
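The two run_test blocks above follow the same shape: the wrapper launches the accel_perf example binary with a generated config on /dev/fd/62 plus the workload flags (-t 1 -w copy -y for the copy run that just ended, -t 1 -w fill -f 128 -q 64 -a 64 -y for the fill run that is starting), then parses the values the binary reports back. A minimal sketch of reproducing one of these runs by hand, assuming a locally built SPDK tree (the ./spdk path is an assumption; the flags are copied verbatim from the traced commands):

    # minimal sketch, assuming SPDK was built so that build/examples/accel_perf exists
    cd ./spdk                                                        # hypothetical local checkout
    ./build/examples/accel_perf -t 1 -w copy -y                      # same flags as the traced copy run
    ./build/examples/accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y   # same flags as the traced fill run

The '1 seconds' value echoed back in the trace suggests -t sets the run time in seconds; the remaining flags are passed through exactly as run_test supplied them.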
00:05:43.158 [2024-05-15 04:06:30.987576] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:43.158 [2024-05-15 04:06:30.987635] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3801492 ] 00:05:43.158 [2024-05-15 04:06:31.069787] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.416 [2024-05-15 04:06:31.190446] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:05:43.416 04:06:31 accel.accel_fill -- 
accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:43.416 04:06:31 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:44.789 
04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:05:44.789 04:06:32 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:44.789 00:05:44.789 real 0m1.516s 00:05:44.789 user 0m0.009s 00:05:44.789 sys 0m0.001s 00:05:44.789 04:06:32 accel.accel_fill -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:44.789 04:06:32 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:05:44.789 ************************************ 00:05:44.789 END TEST accel_fill 00:05:44.789 ************************************ 00:05:44.789 04:06:32 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:05:44.789 04:06:32 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:05:44.789 04:06:32 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:44.789 04:06:32 accel -- common/autotest_common.sh@10 -- # set +x 00:05:44.789 ************************************ 00:05:44.789 START TEST accel_copy_crc32c 00:05:44.789 ************************************ 00:05:44.789 04:06:32 accel.accel_copy_crc32c -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy_crc32c -y 00:05:44.789 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:05:44.789 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:05:44.789 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:44.789 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:05:44.789 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:44.789 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:05:44.790 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:05:44.790 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:44.790 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:44.790 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:44.790 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:44.790 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:44.790 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:05:44.790 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:05:44.790 [2024-05-15 04:06:32.552697] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
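After each run the wrapper makes the three assertions visible at accel.sh@27 above: a module name was captured, the expected opcode was captured, and the module is the software one ([[ software == \s\o\f\t\w\a\r\e ]]). A hedged sketch of an equivalent standalone check; the real helper logic lives in accel.sh and only these three tests are visible in the xtrace:

    # sketch only; mirrors the [[ -n ... ]] checks traced at accel.sh@27
    check_run() {
        local module=$1 opcode=$2
        [[ -n $module && -n $opcode && $module == software ]]
    }
    check_run software fill || echo "fill run did not use the software module" >&2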
00:05:44.790 [2024-05-15 04:06:32.552757] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3801709 ] 00:05:44.790 [2024-05-15 04:06:32.632808] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:44.790 [2024-05-15 04:06:32.748488] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:45.048 04:06:32 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:46.034 
04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:46.034 00:05:46.034 real 0m1.501s 00:05:46.034 user 0m0.010s 00:05:46.034 sys 0m0.001s 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:46.034 04:06:34 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:05:46.034 ************************************ 00:05:46.034 END TEST accel_copy_crc32c 00:05:46.034 ************************************ 00:05:46.291 04:06:34 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:05:46.291 04:06:34 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:05:46.291 04:06:34 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:46.291 04:06:34 accel -- common/autotest_common.sh@10 -- # set +x 00:05:46.291 ************************************ 00:05:46.291 START TEST accel_copy_crc32c_C2 00:05:46.291 ************************************ 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
copy_crc32c -y -C 2 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:05:46.291 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:05:46.291 [2024-05-15 04:06:34.102777] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:46.291 [2024-05-15 04:06:34.102848] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3801877 ] 00:05:46.292 [2024-05-15 04:06:34.186133] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.292 [2024-05-15 04:06:34.306378] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- 
accel/accel.sh@20 -- # val=0 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:46.550 04:06:34 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:47.921 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:47.921 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:47.922 00:05:47.922 real 0m1.505s 00:05:47.922 user 0m0.010s 00:05:47.922 sys 0m0.003s 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:47.922 04:06:35 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:05:47.922 ************************************ 00:05:47.922 END TEST accel_copy_crc32c_C2 00:05:47.922 ************************************ 00:05:47.922 04:06:35 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 
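The copy_crc32c_C2 run that just finished differs from the plain copy_crc32c run only in the extra -C 2 argument on the run_test line and in the second buffer size echoed back in the trace (8192 bytes instead of 4096). A small sketch of driving both variants back to back, with the flag strings copied from the run_test calls in this log (the loop itself is illustrative, not part of accel.sh):

    # flag strings copied from the run_test lines above; word splitting of $args is intentional
    for args in "-t 1 -w copy_crc32c -y" "-t 1 -w copy_crc32c -y -C 2"; do
        ./build/examples/accel_perf $args || exit 1   # assumes the same built SPDK tree as before
    done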
00:05:47.922 04:06:35 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:05:47.922 04:06:35 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:47.922 04:06:35 accel -- common/autotest_common.sh@10 -- # set +x 00:05:47.922 ************************************ 00:05:47.922 START TEST accel_dualcast 00:05:47.922 ************************************ 00:05:47.922 04:06:35 accel.accel_dualcast -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dualcast -y 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:05:47.922 04:06:35 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:05:47.922 [2024-05-15 04:06:35.660278] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
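Most of the volume in this log is accel.sh stepping through its read loop: each value accel_perf prints is split on ':' (IFS=:), read into var/val, and matched by the case statement at accel.sh@21, which is how values such as accel_opc=dualcast and accel_module=software get captured. A rough sketch of that loop shape; the actual key names matched by the case arms are not visible in the xtrace, so the patterns below are placeholders:

    # rough sketch of the IFS=: / read -r var val / case loop the xtrace is stepping through
    while IFS=: read -r var val; do
        case "$var" in
            *opcode*) accel_opc=$val ;;      # placeholder key pattern, not shown in the log
            *module*) accel_module=$val ;;   # placeholder key pattern, not shown in the log
            *) ;;                            # every other key is read and ignored
        esac
    done < "$perf_output"                    # placeholder; the harness reads accel_perf's output directly
    echo "opcode=$accel_opc module=$accel_module"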
00:05:47.922 [2024-05-15 04:06:35.660340] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3802143 ] 00:05:47.922 [2024-05-15 04:06:35.741123] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:47.922 [2024-05-15 04:06:35.860633] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:48.180 04:06:35 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:49.552 04:06:37 
accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:05:49.552 04:06:37 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:49.552 00:05:49.552 real 0m1.518s 00:05:49.552 user 0m0.012s 00:05:49.552 sys 0m0.000s 00:05:49.552 04:06:37 accel.accel_dualcast -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:49.552 04:06:37 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:05:49.552 ************************************ 00:05:49.552 END TEST accel_dualcast 00:05:49.552 ************************************ 00:05:49.552 04:06:37 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:05:49.552 04:06:37 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:05:49.552 04:06:37 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:49.552 04:06:37 accel -- common/autotest_common.sh@10 -- # set +x 00:05:49.552 ************************************ 00:05:49.552 START TEST accel_compare 00:05:49.552 ************************************ 00:05:49.552 04:06:37 accel.accel_compare -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w compare -y 00:05:49.552 04:06:37 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:05:49.552 04:06:37 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:05:49.552 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.552 04:06:37 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:05:49.552 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.552 04:06:37 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:05:49.553 [2024-05-15 04:06:37.226365] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
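Every workload so far reports roughly the same cost: about 1.5 s of wall time (real 0m1.498s through 0m1.518s) and near-zero user/sys time charged to the wrapper itself. A small sketch for pulling those figures out of a saved copy of this console output, assuming it was captured to build.log (the file name is an assumption):

    # sketch; build.log stands in for a saved copy of this console output
    grep -E 'real[[:space:]]+[0-9]+m[0-9.]+s' build.log   # wall-clock time of each workload
    grep -c 'END TEST accel_' build.log                   # how many accel workloads completed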
00:05:49.553 [2024-05-15 04:06:37.226426] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3802306 ] 00:05:49.553 [2024-05-15 04:06:37.305286] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.553 [2024-05-15 04:06:37.431580] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 
accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:49.553 04:06:37 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:50.924 04:06:38 
accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:05:50.924 04:06:38 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:50.924 00:05:50.924 real 0m1.513s 00:05:50.924 user 0m0.010s 00:05:50.924 sys 0m0.002s 00:05:50.924 04:06:38 accel.accel_compare -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:50.924 04:06:38 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:05:50.924 ************************************ 00:05:50.924 END TEST accel_compare 00:05:50.924 ************************************ 00:05:50.924 04:06:38 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:05:50.924 04:06:38 accel -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:05:50.924 04:06:38 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:50.925 04:06:38 accel -- common/autotest_common.sh@10 -- # set +x 00:05:50.925 ************************************ 00:05:50.925 START TEST accel_xor 00:05:50.925 ************************************ 00:05:50.925 04:06:38 accel.accel_xor -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w xor -y 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:05:50.925 04:06:38 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:05:50.925 [2024-05-15 04:06:38.785599] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:05:50.925 [2024-05-15 04:06:38.785652] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3802463 ] 00:05:50.925 [2024-05-15 04:06:38.868714] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:51.183 [2024-05-15 04:06:38.987946] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:51.183 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:05:51.184 04:06:39 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:51.184 04:06:39 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:52.556 00:05:52.556 real 0m1.509s 00:05:52.556 user 0m0.010s 00:05:52.556 sys 0m0.003s 00:05:52.556 04:06:40 accel.accel_xor -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:52.556 04:06:40 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:05:52.556 ************************************ 00:05:52.556 END TEST accel_xor 00:05:52.556 ************************************ 00:05:52.556 04:06:40 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:05:52.556 04:06:40 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:05:52.556 04:06:40 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:52.556 04:06:40 accel -- common/autotest_common.sh@10 -- # set +x 00:05:52.556 ************************************ 00:05:52.556 START TEST accel_xor 00:05:52.556 ************************************ 00:05:52.556 04:06:40 accel.accel_xor -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w xor -y -x 3 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:52.556 04:06:40 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:52.557 04:06:40 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:52.557 04:06:40 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:52.557 04:06:40 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:05:52.557 04:06:40 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:05:52.557 [2024-05-15 04:06:40.353989] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
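The accel_xor case now in progress repeats the xor workload with an extra -x 3 argument; the earlier xor run records val=2 after val=xor while this one records val=3, so -x appears to set the number of xor source buffers (that reading is an inference from the trace, not something confirmed here). The harness-level call and the resulting accel_perf invocation, both copied from the entries above, are:

    run_test accel_xor accel_test -t 1 -w xor -y -x 3

    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
      -c /dev/fd/62 -t 1 -w xor -y -x 3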
00:05:52.557 [2024-05-15 04:06:40.354057] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3802737 ] 00:05:52.557 [2024-05-15 04:06:40.435925] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.557 [2024-05-15 04:06:40.553276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.815 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.815 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.815 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.815 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.815 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.815 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.815 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.815 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.815 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:05:52.815 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:05:52.816 04:06:40 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:52.816 04:06:40 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:05:54.190 04:06:41 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:54.190 00:05:54.190 real 0m1.503s 00:05:54.190 user 0m1.338s 00:05:54.190 sys 0m0.159s 00:05:54.190 04:06:41 accel.accel_xor -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:54.190 04:06:41 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:05:54.190 ************************************ 00:05:54.190 END TEST accel_xor 00:05:54.190 ************************************ 00:05:54.190 04:06:41 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:05:54.190 04:06:41 accel -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:05:54.190 04:06:41 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:54.190 04:06:41 accel -- common/autotest_common.sh@10 -- # set +x 00:05:54.190 ************************************ 00:05:54.190 START TEST accel_dif_verify 00:05:54.190 ************************************ 00:05:54.190 04:06:41 accel.accel_dif_verify -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dif_verify 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:05:54.190 04:06:41 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:05:54.190 [2024-05-15 04:06:41.900744] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
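Each sub-test above finishes with three accel/accel.sh@27 checks, seen in expanded form as [[ -n software ]], [[ -n <opcode> ]] and [[ software == software ]]. Given the accel_module=software and accel_opc=<opcode> assignments earlier in the same trace (accel.sh@22/@23), a plausible unexpanded form of those assertions is sketched below; the variable names are taken from the xtrace output and the real script may differ.

    [[ -n "$accel_module" ]]              # a module was selected (software here)
    [[ -n "$accel_opc" ]]                 # the opcode under test was recorded
    [[ "$accel_module" == software ]]     # and it is the expected software module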
00:05:54.190 [2024-05-15 04:06:41.900822] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3802894 ] 00:05:54.191 [2024-05-15 04:06:41.985318] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:54.191 [2024-05-15 04:06:42.104748] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:54.191 04:06:42 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" 
in 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:55.564 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:05:55.565 04:06:43 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:55.565 00:05:55.565 real 0m1.521s 00:05:55.565 user 0m1.352s 00:05:55.565 sys 0m0.163s 00:05:55.565 04:06:43 accel.accel_dif_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:55.565 04:06:43 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:05:55.565 ************************************ 00:05:55.565 END TEST accel_dif_verify 00:05:55.565 ************************************ 00:05:55.565 04:06:43 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:05:55.565 04:06:43 accel -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:05:55.565 04:06:43 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:55.565 04:06:43 accel -- common/autotest_common.sh@10 -- # set +x 00:05:55.565 ************************************ 00:05:55.565 START TEST accel_dif_generate 00:05:55.565 ************************************ 00:05:55.565 04:06:43 accel.accel_dif_generate -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dif_generate 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # 
read -r var val 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:05:55.565 04:06:43 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:05:55.565 [2024-05-15 04:06:43.474227] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:55.565 [2024-05-15 04:06:43.474282] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3803097 ] 00:05:55.565 [2024-05-15 04:06:43.555515] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.823 [2024-05-15 04:06:43.675611] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.823 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:55.823 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.823 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 
accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # 
read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:55.824 04:06:43 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:05:57.197 04:06:44 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:57.197 00:05:57.197 real 0m1.511s 00:05:57.197 user 0m1.333s 00:05:57.197 sys 0m0.174s 00:05:57.197 04:06:44 accel.accel_dif_generate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:57.197 04:06:44 
accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:05:57.197 ************************************ 00:05:57.197 END TEST accel_dif_generate 00:05:57.197 ************************************ 00:05:57.197 04:06:44 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:05:57.197 04:06:44 accel -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:05:57.197 04:06:44 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:57.197 04:06:44 accel -- common/autotest_common.sh@10 -- # set +x 00:05:57.197 ************************************ 00:05:57.197 START TEST accel_dif_generate_copy 00:05:57.197 ************************************ 00:05:57.197 04:06:45 accel.accel_dif_generate_copy -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dif_generate_copy 00:05:57.197 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:05:57.197 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:05:57.197 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.197 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:05:57.198 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.198 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:05:57.198 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:05:57.198 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:57.198 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:57.198 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:57.198 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:57.198 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:57.198 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:05:57.198 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:05:57.198 [2024-05-15 04:06:45.036529] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
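By this point the log has recorded END TEST markers and real/user/sys timings for dualcast, compare, two xor variants, dif_verify and dif_generate. For quickly scanning a saved console capture like this one, a small hypothetical helper can list each completed accel sub-test together with its wall-clock time; console.log is a placeholder filename, and the pattern assumes the capture keeps one log entry per line.

    # Hypothetical, not part of the harness: summarize finished accel sub-tests.
    grep -E 'END TEST accel_|real[[:space:]]+[0-9]+m[0-9.]+s' console.log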
00:05:57.198 [2024-05-15 04:06:45.036590] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3803325 ] 00:05:57.198 [2024-05-15 04:06:45.117870] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.456 [2024-05-15 04:06:45.238140] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:57.456 04:06:45 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- 
accel/accel.sh@19 -- # read -r var val 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:58.829 00:05:58.829 real 0m1.499s 00:05:58.829 user 0m0.009s 00:05:58.829 sys 0m0.002s 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:58.829 04:06:46 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:05:58.829 ************************************ 00:05:58.829 END TEST accel_dif_generate_copy 00:05:58.829 ************************************ 00:05:58.829 04:06:46 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:05:58.829 04:06:46 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:05:58.829 04:06:46 accel -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:05:58.829 04:06:46 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:58.829 04:06:46 accel -- common/autotest_common.sh@10 -- # set +x 00:05:58.829 ************************************ 00:05:58.829 START TEST accel_comp 00:05:58.829 ************************************ 00:05:58.829 04:06:46 accel.accel_comp -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:05:58.829 04:06:46 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:05:58.829 04:06:46 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:05:58.829 04:06:46 accel.accel_comp -- 
accel/accel.sh@19 -- # IFS=: 00:05:58.829 04:06:46 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:05:58.829 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:58.829 04:06:46 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:05:58.829 04:06:46 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:05:58.829 04:06:46 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:05:58.829 04:06:46 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:05:58.829 04:06:46 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:58.830 04:06:46 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:58.830 04:06:46 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:05:58.830 04:06:46 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:05:58.830 04:06:46 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:05:58.830 [2024-05-15 04:06:46.585980] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:05:58.830 [2024-05-15 04:06:46.586041] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3803492 ] 00:05:58.830 [2024-05-15 04:06:46.667884] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:58.830 [2024-05-15 04:06:46.786729] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var 
val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:05:59.088 04:06:46 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:06:00.461 04:06:48 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:00.461 00:06:00.461 real 0m1.524s 00:06:00.461 user 0m0.011s 00:06:00.461 sys 0m0.002s 00:06:00.461 04:06:48 accel.accel_comp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:00.461 04:06:48 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:06:00.461 ************************************ 00:06:00.461 END TEST accel_comp 00:06:00.461 ************************************ 00:06:00.461 04:06:48 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:00.461 04:06:48 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:06:00.461 04:06:48 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:00.461 04:06:48 accel -- common/autotest_common.sh@10 -- # set +x 00:06:00.461 ************************************ 00:06:00.461 START TEST accel_decomp 00:06:00.461 ************************************ 00:06:00.461 04:06:48 accel.accel_decomp -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:06:00.461 [2024-05-15 04:06:48.160195] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:06:00.461 [2024-05-15 04:06:48.160256] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3803760 ] 00:06:00.461 [2024-05-15 04:06:48.241487] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:00.461 [2024-05-15 04:06:48.361835] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 
accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:06:00.461 04:06:48 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.461 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.462 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:00.462 04:06:48 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:00.462 04:06:48 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:00.462 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:00.462 04:06:48 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:01.836 04:06:49 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:01.836 00:06:01.836 real 0m1.519s 00:06:01.836 user 0m0.008s 00:06:01.836 sys 0m0.005s 00:06:01.836 04:06:49 accel.accel_decomp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:01.836 04:06:49 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:06:01.836 ************************************ 00:06:01.836 END TEST accel_decomp 00:06:01.836 ************************************ 00:06:01.837 04:06:49 accel -- accel/accel.sh@118 -- # run_test accel_decmop_full accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:01.837 04:06:49 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:06:01.837 04:06:49 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:01.837 04:06:49 accel -- common/autotest_common.sh@10 -- # set +x 00:06:01.837 ************************************ 00:06:01.837 START TEST accel_decmop_full 00:06:01.837 ************************************ 00:06:01.837 04:06:49 accel.accel_decmop_full -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@16 -- # local accel_opc 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@17 -- # local accel_module 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@12 -- # build_accel_config 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@40 -- # local IFS=, 00:06:01.837 04:06:49 accel.accel_decmop_full -- accel/accel.sh@41 -- # jq -r . 00:06:01.837 [2024-05-15 04:06:49.729234] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
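The build_accel_config trace that precedes each accel_perf launch (the empty accel_json_cfg=() array, the `[[ 0 -gt 0 ]]` checks, `[[ -n '' ]]`, `local IFS=,` and `jq -r .`) assembles the JSON that accel_perf reads through the `-c /dev/fd/62` process substitution. In this job no hardware accel module is requested, so the config stays empty and the software module is used, which the `[[ software == software ]]` assertions later confirm. A hedged sketch of that helper, with the flag and RPC method names given only as plausible examples:

    # Sketch: emit an accel JSON config on stdout; with no module flags set the
    # config array stays empty and accel_perf falls back to the software module.
    build_accel_config_sketch() {
        local accel_json_cfg=()
        local IFS=,
        # hypothetical switch; the real script checks several SPDK_TEST_ACCEL_* variables
        if [[ ${SPDK_TEST_ACCEL_DSA:-0} -gt 0 ]]; then
            accel_json_cfg+=('{"method": "dsa_scan_accel_module"}')
        fi
        jq -r . <<< '{"subsystems": [{"subsystem": "accel", "config": ['"${accel_json_cfg[*]}"']}]}'
    }
    # Usage (illustrative): accel_perf -c <(build_accel_config_sketch) -t 1 -w compress ...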
00:06:01.837 [2024-05-15 04:06:49.729294] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3803917 ] 00:06:01.837 [2024-05-15 04:06:49.810411] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.095 [2024-05-15 04:06:49.928063] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=0x1 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=decompress 00:06:02.095 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- 
accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=software 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@22 -- # accel_module=software 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=32 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=32 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=1 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val='1 seconds' 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val=Yes 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:02.096 04:06:50 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:03.468 04:06:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:03.468 04:06:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:03.469 04:06:51 
accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@20 -- # val= 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:03.469 04:06:51 accel.accel_decmop_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:03.469 00:06:03.469 real 0m1.508s 00:06:03.469 user 0m0.014s 00:06:03.469 sys 0m0.002s 00:06:03.469 04:06:51 accel.accel_decmop_full -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:03.469 04:06:51 accel.accel_decmop_full -- common/autotest_common.sh@10 -- # set +x 00:06:03.469 ************************************ 00:06:03.469 END TEST accel_decmop_full 00:06:03.469 ************************************ 00:06:03.469 04:06:51 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:03.469 04:06:51 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:06:03.469 04:06:51 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:03.469 04:06:51 accel -- common/autotest_common.sh@10 -- # set +x 00:06:03.469 ************************************ 00:06:03.469 START TEST accel_decomp_mcore 00:06:03.469 ************************************ 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:03.469 
04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:03.469 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:03.469 [2024-05-15 04:06:51.290191] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:06:03.469 [2024-05-15 04:06:51.290253] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3804080 ] 00:06:03.469 [2024-05-15 04:06:51.372297] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:03.727 [2024-05-15 04:06:51.494682] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:03.727 [2024-05-15 04:06:51.494732] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:03.727 [2024-05-15 04:06:51.494850] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:03.727 [2024-05-15 04:06:51.494855] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 
04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.727 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
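The `-m 0xf` core mask passed to accel_perf for the mcore variants is why four separate "Reactor started on core N" notices appear above instead of one: each set bit in the mask gets its own reactor. A quick illustrative way to list the cores a mask selects (not part of the test scripts):

    # Print the core indices selected by an SPDK-style hex core mask.
    mask=0xf
    for ((core = 0; core < 64; core++)); do
        (( (mask >> core) & 1 )) && echo "core $core"
    done
    # 0xf -> cores 0 1 2 3, matching the four reactor notices logged above.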
00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:03.728 04:06:51 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.101 04:06:52 
accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:05.101 00:06:05.101 real 0m1.521s 00:06:05.101 user 0m4.824s 00:06:05.101 sys 0m0.163s 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:05.101 04:06:52 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:05.101 ************************************ 00:06:05.101 END TEST accel_decomp_mcore 00:06:05.101 ************************************ 00:06:05.101 04:06:52 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:05.101 04:06:52 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:06:05.101 04:06:52 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:05.101 04:06:52 accel -- common/autotest_common.sh@10 -- # set +x 00:06:05.101 ************************************ 00:06:05.101 START TEST accel_decomp_full_mcore 00:06:05.101 ************************************ 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:05.102 
04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:05.102 04:06:52 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:05.102 [2024-05-15 04:06:52.865041] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:06:05.102 [2024-05-15 04:06:52.865102] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3804355 ] 00:06:05.102 [2024-05-15 04:06:52.946069] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:05.102 [2024-05-15 04:06:53.069020] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:05.102 [2024-05-15 04:06:53.069088] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:05.102 [2024-05-15 04:06:53.069187] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.102 [2024-05-15 04:06:53.069183] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.360 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.360 04:06:53 
accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:05.361 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.361 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.361 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:05.361 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:05.361 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:05.361 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:05.361 04:06:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:06.732 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:06.732 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:06.732 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 
accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:06.733 00:06:06.733 real 0m1.537s 00:06:06.733 user 0m4.871s 00:06:06.733 sys 0m0.181s 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:06.733 04:06:54 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:06.733 ************************************ 00:06:06.733 END TEST accel_decomp_full_mcore 00:06:06.733 ************************************ 00:06:06.733 04:06:54 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:06.733 04:06:54 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:06:06.733 04:06:54 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:06.733 04:06:54 accel -- common/autotest_common.sh@10 -- # set +x 00:06:06.733 ************************************ 00:06:06.733 START TEST accel_decomp_mthread 00:06:06.733 ************************************ 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:06:06.733 [2024-05-15 04:06:54.452873] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:06:06.733 [2024-05-15 04:06:54.452933] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3804519 ] 00:06:06.733 [2024-05-15 04:06:54.536039] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.733 [2024-05-15 04:06:54.651060] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:06:06.733 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.734 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.734 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.734 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:06.734 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.734 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.734 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:06.734 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:06.734 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:06.734 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:06.734 04:06:54 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.107 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:08.108 00:06:08.108 real 0m1.512s 00:06:08.108 user 0m1.338s 00:06:08.108 sys 0m0.168s 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:08.108 04:06:55 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:06:08.108 ************************************ 00:06:08.108 END TEST accel_decomp_mthread 00:06:08.108 ************************************ 00:06:08.108 04:06:55 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:08.108 04:06:55 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:06:08.108 04:06:55 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:08.108 04:06:55 accel -- common/autotest_common.sh@10 -- # set +x 00:06:08.108 ************************************ 00:06:08.108 START TEST accel_decomp_full_mthread 00:06:08.108 
************************************ 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:06:08.108 04:06:55 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:06:08.108 [2024-05-15 04:06:56.016004] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:06:08.108 [2024-05-15 04:06:56.016068] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3804712 ] 00:06:08.108 [2024-05-15 04:06:56.102010] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.367 [2024-05-15 04:06:56.222330] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.367 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:08.368 04:06:56 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:08.368 04:06:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:09.742 00:06:09.742 real 0m1.549s 00:06:09.742 user 0m1.369s 00:06:09.742 sys 0m0.175s 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:09.742 04:06:57 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:06:09.742 ************************************ 00:06:09.742 END TEST accel_decomp_full_mthread 00:06:09.742 ************************************ 
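
Note: the three software-path decompress tests above (accel_decomp_full_mcore, accel_decomp_mthread and accel_decomp_full_mthread) all wrap the same accel_perf example binary that is visible in the xtrace. The block below is a minimal, hedged sketch of reproducing the accel_decomp_full_mthread run by hand: the path and flags are copied from the command line printed in the log, the flag descriptions are best-effort readings of that command line, and the accel JSON config the harness feeds over /dev/fd/62 (which carries no module entries for the pure software tests) is omitted.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk   # workspace path as printed in the log

# -t 1   run the workload for 1 second ('1 seconds' in the config dump above)
# -w     workload type (decompress)
# -l     input data file for the (de)compression workloads (test/accel/bib, as in the log)
# -y     verify the decompressed output
# -o 0   appears to select the full file size as the job size: the *_full variants
#        report '111250 bytes' above, versus '4096 bytes' for the fixed-size variants
# -T 2   two worker threads, copied from the _mthread command line
"$SPDK/build/examples/accel_perf" -t 1 -w decompress \
    -l "$SPDK/test/accel/bib" -y -o 0 -T 2
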
00:06:09.742 04:06:57 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:06:09.742 04:06:57 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:06:09.742 04:06:57 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:06:09.742 04:06:57 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:09.742 04:06:57 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=3804949 00:06:09.742 04:06:57 accel -- accel/accel.sh@63 -- # waitforlisten 3804949 00:06:09.742 04:06:57 accel -- common/autotest_common.sh@827 -- # '[' -z 3804949 ']' 00:06:09.742 04:06:57 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:09.742 04:06:57 accel -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:09.742 04:06:57 accel -- accel/accel.sh@61 -- # build_accel_config 00:06:09.742 04:06:57 accel -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:09.742 04:06:57 accel -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:09.742 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:09.742 04:06:57 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:09.742 04:06:57 accel -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:09.742 04:06:57 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:09.742 04:06:57 accel -- common/autotest_common.sh@10 -- # set +x 00:06:09.742 04:06:57 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:09.742 04:06:57 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:09.742 04:06:57 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:09.742 04:06:57 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:09.742 04:06:57 accel -- accel/accel.sh@40 -- # local IFS=, 00:06:09.742 04:06:57 accel -- accel/accel.sh@41 -- # jq -r . 00:06:09.742 [2024-05-15 04:06:57.628784] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:06:09.742 [2024-05-15 04:06:57.628901] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3804949 ] 00:06:09.742 [2024-05-15 04:06:57.705818] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.000 [2024-05-15 04:06:57.815782] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:10.565 [2024-05-15 04:06:58.500186] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:10.823 04:06:58 accel -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:10.823 04:06:58 accel -- common/autotest_common.sh@860 -- # return 0 00:06:10.823 04:06:58 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:06:10.823 04:06:58 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:06:10.823 04:06:58 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:06:10.823 04:06:58 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:06:10.823 04:06:58 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:06:10.823 04:06:58 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:06:10.823 04:06:58 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:06:10.823 04:06:58 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:10.823 04:06:58 accel -- common/autotest_common.sh@10 -- # set +x 00:06:10.823 04:06:58 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:11.111 "method": "compressdev_scan_accel_module", 00:06:11.111 04:06:58 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:11.111 04:06:58 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:11.111 04:06:58 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@10 -- # set +x 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # IFS== 00:06:11.111 04:06:58 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:11.111 04:06:58 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:11.111 04:06:58 accel -- accel/accel.sh@75 -- # killprocess 3804949 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@946 -- # '[' -z 3804949 ']' 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@950 -- # kill -0 3804949 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@951 -- # uname 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3804949 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3804949' 00:06:11.111 killing process with pid 3804949 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@965 -- # kill 3804949 00:06:11.111 04:06:58 accel -- common/autotest_common.sh@970 -- # wait 3804949 00:06:11.393 04:06:59 accel -- accel/accel.sh@76 -- # trap - ERR 00:06:11.393 04:06:59 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:11.393 04:06:59 accel -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:06:11.393 04:06:59 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:11.393 04:06:59 accel -- common/autotest_common.sh@10 -- # set +x 00:06:11.651 ************************************ 00:06:11.651 START TEST accel_cdev_comp 00:06:11.651 ************************************ 00:06:11.651 04:06:59 accel.accel_cdev_comp -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@12 -- # 
build_accel_config 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:06:11.651 04:06:59 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:06:11.651 [2024-05-15 04:06:59.449538] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:06:11.651 [2024-05-15 04:06:59.449600] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3805168 ] 00:06:11.651 [2024-05-15 04:06:59.532784] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.651 [2024-05-15 04:06:59.652332] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.584 [2024-05-15 04:07:00.338665] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:12.584 [2024-05-15 04:07:00.341140] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13bb150 PMD being used: compress_qat 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:12.584 [2024-05-15 04:07:00.345372] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x13bfe50 PMD being used: compress_qat 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 
00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 
accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:12.584 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:12.585 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:12.585 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:12.585 04:07:00 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:06:13.958 04:07:01 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:13.958 00:06:13.958 real 0m2.146s 00:06:13.958 user 0m1.614s 00:06:13.958 sys 0m0.526s 00:06:13.958 04:07:01 accel.accel_cdev_comp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:13.958 04:07:01 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:06:13.958 ************************************ 00:06:13.958 END TEST accel_cdev_comp 00:06:13.958 ************************************ 00:06:13.958 04:07:01 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w 
decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:13.958 04:07:01 accel -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:06:13.958 04:07:01 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:13.958 04:07:01 accel -- common/autotest_common.sh@10 -- # set +x 00:06:13.958 ************************************ 00:06:13.958 START TEST accel_cdev_decomp 00:06:13.958 ************************************ 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:06:13.958 04:07:01 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:06:13.958 [2024-05-15 04:07:01.648214] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
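
Note: for the accel_cdev_* tests the harness first starts spdk_tgt with an accel config that registers the DPDK compressdev module -- the '{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}' fragment appended to accel_json_cfg above -- records the expected opcode-to-module assignments over RPC, and kills it (pid 3804949). Each accel_cdev_* test then feeds the same fragment to accel_perf over /dev/fd/62, which is why the 'initialized QAT PMD' and 'PMD being used: compress_qat' notices appear and why the tests assert accel_module == dpdk_compressdev. The block below is a hedged sketch of checking that opcode mapping by hand; the config file name and the sleep are illustrative stand-ins (the harness uses waitforlisten), and "pmd": 0 is copied verbatim from the log.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk   # workspace path as printed in the log

# Same accel subsystem fragment the harness builds when COMPRESSDEV=1.
cat > /tmp/accel_compressdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "accel",
      "config": [
        { "method": "compressdev_scan_accel_module", "params": { "pmd": 0 } }
      ]
    }
  ]
}
EOF

"$SPDK/build/bin/spdk_tgt" -c /tmp/accel_compressdev.json &
tgt_pid=$!
sleep 3   # crude stand-in for the harness's waitforlisten helper

# Expect compress/decompress -> dpdk_compressdev and every other opcode -> software,
# matching the expected_opcs[...] assignments shown earlier in the log
# (same jq expression the harness uses).
"$SPDK/scripts/rpc.py" accel_get_opc_assignments \
    | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'

kill "$tgt_pid"
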
00:06:13.958 [2024-05-15 04:07:01.648279] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3805407 ] 00:06:13.958 [2024-05-15 04:07:01.728323] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.958 [2024-05-15 04:07:01.847030] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.523 [2024-05-15 04:07:02.536419] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:14.523 [2024-05-15 04:07:02.538878] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25ee150 PMD being used: compress_qat 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:14.782 [2024-05-15 04:07:02.543319] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25f2e50 PMD being used: compress_qat 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 
-- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:14.782 04:07:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:14.782 04:07:02 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:16.155 00:06:16.155 real 0m2.138s 00:06:16.155 user 0m1.617s 00:06:16.155 sys 0m0.516s 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:16.155 04:07:03 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:06:16.155 ************************************ 00:06:16.155 END TEST accel_cdev_decomp 00:06:16.155 ************************************ 00:06:16.155 04:07:03 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decmop_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.155 04:07:03 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:06:16.155 04:07:03 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:16.155 04:07:03 accel -- common/autotest_common.sh@10 -- # set +x 00:06:16.155 ************************************ 00:06:16.155 START TEST accel_cdev_decmop_full 00:06:16.155 ************************************ 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 
-o 0 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@16 -- # local accel_opc 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@17 -- # local accel_module 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@12 -- # build_accel_config 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@40 -- # local IFS=, 00:06:16.155 04:07:03 accel.accel_cdev_decmop_full -- accel/accel.sh@41 -- # jq -r . 00:06:16.155 [2024-05-15 04:07:03.839190] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
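The xtrace above shows accel.sh launching build/examples/accel_perf for the accel_cdev_decmop_full case with the compressdev accel module enabled through a JSON config fed in on /dev/fd/62. A minimal standalone sketch of the same invocation follows; the temp-file path and the "subsystems" wrapper are assumptions (accel.sh builds the JSON on the fly), while the compressdev_scan_accel_module method, its "pmd": 0 parameter, and the accel_perf flags are copied from the trace.

  # Hypothetical reproduction of the traced accel_perf run; assumes the same workspace
  # layout as this job and QAT devices already bound to a DPDK-capable driver.
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # The "subsystems" wrapper follows SPDK's usual JSON config layout (an assumption);
  # accel.sh feeds an equivalent document on /dev/fd/62 instead of a file.
  printf '%s\n' '{"subsystems": [{"subsystem": "accel", "config": [{"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}]}]}' \
      > /tmp/compressdev_accel.json
  # Same flags as the traced command: one second of decompress against test/accel/bib,
  # with -o 0 selecting the full input size instead of the default 4096-byte blocks.
  "$SPDK/build/examples/accel_perf" -c /tmp/compressdev_accel.json \
      -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -o 0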
00:06:16.155 [2024-05-15 04:07:03.839254] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3805686 ] 00:06:16.155 [2024-05-15 04:07:03.925148] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.155 [2024-05-15 04:07:04.046026] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.088 [2024-05-15 04:07:04.738487] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:17.088 [2024-05-15 04:07:04.740925] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x252c150 PMD being used: compress_qat 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=0x1 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:17.088 [2024-05-15 04:07:04.744384] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x252f470 PMD being used: compress_qat 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=decompress 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val='111250 bytes' 
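Where the plain accel_cdev_decomp run above reported a '4096 bytes' transfer size, this -o 0 variant reports '111250 bytes', presumably the full size of the bib input rather than the default block size. A quick, hedged way to check that assumption on a checkout with the same layout:

  # Expected to print 111250 if the assumption about -o 0 using the whole file holds.
  stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib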
00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=32 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=32 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=1 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val='1 seconds' 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val=Yes 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 
accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:17.088 04:07:04 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@20 -- # val= 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@21 -- # case "$var" in 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # IFS=: 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@19 -- # read -r var val 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:18.019 00:06:18.019 real 0m2.153s 00:06:18.019 user 0m1.625s 00:06:18.019 sys 0m0.509s 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:18.019 04:07:05 accel.accel_cdev_decmop_full -- common/autotest_common.sh@10 -- # set +x 00:06:18.019 ************************************ 00:06:18.019 END TEST accel_cdev_decmop_full 00:06:18.019 ************************************ 00:06:18.019 04:07:05 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:18.019 04:07:05 accel -- common/autotest_common.sh@1097 -- 
# '[' 11 -le 1 ']' 00:06:18.019 04:07:05 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:18.019 04:07:05 accel -- common/autotest_common.sh@10 -- # set +x 00:06:18.019 ************************************ 00:06:18.019 START TEST accel_cdev_decomp_mcore 00:06:18.019 ************************************ 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:18.019 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:18.277 [2024-05-15 04:07:06.044881] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
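accel_cdev_decomp_mcore repeats the same decompress workload with -m 0xf, so the entries that follow show -c 0xf in the DPDK EAL parameters, four available cores, and one reactor per core in the mask. A hedged standalone sketch of the multi-core invocation, reusing the SPDK variable and the illustrative config file from the sketch above:

  # Multi-core variant of the hypothetical reproduction; -m 0xf is taken verbatim
  # from the traced run_test line, the rest of the flags are unchanged.
  "$SPDK/build/examples/accel_perf" -c /tmp/compressdev_accel.json \
      -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -m 0xf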
00:06:18.277 [2024-05-15 04:07:06.044936] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3805970 ] 00:06:18.277 [2024-05-15 04:07:06.131493] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:18.277 [2024-05-15 04:07:06.254360] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:18.277 [2024-05-15 04:07:06.254431] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:18.277 [2024-05-15 04:07:06.254484] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:18.277 [2024-05-15 04:07:06.254488] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.208 [2024-05-15 04:07:06.861134] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:19.208 [2024-05-15 04:07:06.863362] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x219c7a0 PMD being used: compress_qat 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:19.208 [2024-05-15 04:07:06.869088] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f98c019b890 PMD being used: compress_qat 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 [2024-05-15 04:07:06.870426] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f98b819b890 PMD being used: compress_qat 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:06:19.208 [2024-05-15 04:07:06.871109] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21a1cc0 PMD being used: compress_qat 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 [2024-05-15 04:07:06.871217] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f98b019b890 PMD being used: compress_qat 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:19.208 04:07:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:20.138 00:06:20.138 real 0m2.090s 00:06:20.138 user 0m6.749s 00:06:20.138 sys 0m0.495s 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:20.138 04:07:08 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:20.138 ************************************ 00:06:20.138 END TEST accel_cdev_decomp_mcore 00:06:20.138 ************************************ 00:06:20.138 04:07:08 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:20.138 04:07:08 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:06:20.138 04:07:08 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:20.138 04:07:08 accel -- common/autotest_common.sh@10 -- # set +x 00:06:20.395 ************************************ 00:06:20.395 START TEST accel_cdev_decomp_full_mcore 00:06:20.395 ************************************ 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:20.395 04:07:08 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:06:20.395 04:07:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:06:20.395 [2024-05-15 04:07:08.185816] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:06:20.395 [2024-05-15 04:07:08.185901] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3806252 ] 00:06:20.395 [2024-05-15 04:07:08.271861] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:20.395 [2024-05-15 04:07:08.395428] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:20.395 [2024-05-15 04:07:08.395493] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:20.395 [2024-05-15 04:07:08.395580] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:20.395 [2024-05-15 04:07:08.395582] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.326 [2024-05-15 04:07:09.011969] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:21.326 [2024-05-15 04:07:09.014231] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe2c7a0 PMD being used: compress_qat 00:06:21.326 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore 
-- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 [2024-05-15 04:07:09.019090] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fac3c19b890 PMD being used: compress_qat 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:21.327 [2024-05-15 04:07:09.020409] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fac3419b890 PMD being used: compress_qat 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 
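With the 0xf core mask, several compressdev channels are created and each logs which PMD it picked, which is why multiple 'Channel ... PMD being used: compress_qat' notices are interleaved with the xtrace in this block. A hedged one-liner for counting them in a saved copy of this console output (the file name is illustrative):

  grep -c 'PMD being used: compress_qat' crypto-phy-autotest-console.log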
00:06:21.327 [2024-05-15 04:07:09.021065] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe2c840 PMD being used: compress_qat 00:06:21.327 [2024-05-15 04:07:09.021169] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fac2c19b890 PMD being used: compress_qat 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:21.327 04:07:09 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:22.269 00:06:22.269 real 0m2.106s 00:06:22.269 user 0m6.786s 00:06:22.269 sys 0m0.503s 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:22.269 04:07:10 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:06:22.269 ************************************ 00:06:22.269 END TEST accel_cdev_decomp_full_mcore 00:06:22.269 ************************************ 00:06:22.532 04:07:10 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:22.532 04:07:10 accel -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:06:22.532 04:07:10 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:22.532 04:07:10 accel -- common/autotest_common.sh@10 -- # set +x 00:06:22.532 
************************************ 00:06:22.532 START TEST accel_cdev_decomp_mthread 00:06:22.532 ************************************ 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:06:22.532 04:07:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:06:22.532 [2024-05-15 04:07:10.344264] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
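accel_cdev_decomp_mthread keeps the single-core 0x1 mask but adds -T 2, which shows up as val=2 in the trace below and presumably requests a second worker thread per core. A hedged standalone sketch, again reusing the SPDK variable and illustrative config file from the first sketch:

  # Threaded variant; -T 2 is taken verbatim from the traced run_test line.
  "$SPDK/build/examples/accel_perf" -c /tmp/compressdev_accel.json \
      -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -T 2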
00:06:22.532 [2024-05-15 04:07:10.344319] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3806537 ] 00:06:22.532 [2024-05-15 04:07:10.428769] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.790 [2024-05-15 04:07:10.547583] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.356 [2024-05-15 04:07:11.214203] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:23.356 [2024-05-15 04:07:11.216666] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x111d150 PMD being used: compress_qat 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:23.356 [2024-05-15 04:07:11.222021] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x11221e0 PMD being used: compress_qat 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 
00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.356 [2024-05-15 04:07:11.224340] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1244fe0 PMD being used: compress_qat 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.356 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.357 04:07:11 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:23.357 04:07:11 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:24.730 
04:07:12 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:24.730 00:06:24.730 real 0m2.138s 00:06:24.730 user 0m1.622s 00:06:24.730 sys 0m0.509s 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:24.730 04:07:12 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:06:24.730 ************************************ 00:06:24.730 END TEST accel_cdev_decomp_mthread 00:06:24.730 ************************************ 00:06:24.730 04:07:12 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:24.730 04:07:12 accel -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:06:24.730 04:07:12 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:24.730 04:07:12 accel -- common/autotest_common.sh@10 -- # set +x 00:06:24.730 ************************************ 00:06:24.730 START TEST accel_cdev_decomp_full_mthread 00:06:24.730 ************************************ 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:06:24.730 04:07:12 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:06:24.730 [2024-05-15 04:07:12.538138] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
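The full command the harness drives for this case is visible in the trace above. A hedged standalone sketch of the same accel_perf run, with the accel JSON written to a file instead of the /dev/fd/62 descriptor the harness supplies: the /tmp path and the outer "subsystems" wrapper are assumptions; only the compressdev_scan_accel_module fragment and the accel_perf arguments appear in the log.
# illustrative config file; the harness builds this JSON in memory via build_accel_config
cat > /tmp/accel.json <<'EOF'
{"subsystems": [{"subsystem": "accel", "config": [{"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}]}]}
EOF
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
    -c /tmp/accel.json -t 1 -w decompress \
    -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2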
00:06:24.730 [2024-05-15 04:07:12.538205] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3806823 ] 00:06:24.730 [2024-05-15 04:07:12.625528] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.988 [2024-05-15 04:07:12.749176] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.553 [2024-05-15 04:07:13.442266] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:06:25.553 [2024-05-15 04:07:13.444730] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2360150 PMD being used: compress_qat 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:06:25.553 [2024-05-15 04:07:13.449287] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2363470 PMD being used: compress_qat 00:06:25.553 04:07:13 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:06:25.553 [2024-05-15 04:07:13.451913] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2487c40 PMD being used: compress_qat 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.553 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:25.554 04:07:13 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:06:26.924 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:06:26.925 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:06:26.925 04:07:14 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:06:26.925 00:06:26.925 real 0m2.163s 00:06:26.925 user 0m0.011s 00:06:26.925 sys 0m0.003s 00:06:26.925 04:07:14 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:26.925 04:07:14 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:06:26.925 ************************************ 00:06:26.925 END TEST accel_cdev_decomp_full_mthread 00:06:26.925 ************************************ 00:06:26.925 04:07:14 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:06:26.925 04:07:14 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:26.925 04:07:14 accel -- accel/accel.sh@137 -- # build_accel_config 00:06:26.925 04:07:14 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:26.925 04:07:14 accel -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:06:26.925 04:07:14 accel -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:26.925 04:07:14 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:26.925 04:07:14 accel -- common/autotest_common.sh@10 -- # set +x 00:06:26.925 04:07:14 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.925 04:07:14 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.925 04:07:14 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:26.925 04:07:14 accel -- accel/accel.sh@40 -- # local IFS=, 00:06:26.925 04:07:14 accel -- accel/accel.sh@41 -- # jq -r . 00:06:26.925 ************************************ 00:06:26.925 START TEST accel_dif_functional_tests 00:06:26.925 ************************************ 00:06:26.925 04:07:14 accel.accel_dif_functional_tests -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:26.925 [2024-05-15 04:07:14.773052] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
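The dif functional-test binary launched above takes its accel configuration the same way, over a file descriptor. The -c /dev/fd/62 form seen in the trace is what a bash process substitution expands to; schematically (build_accel_config is the harness function traced above, referenced here only by name):
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c <(build_accel_config)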
00:06:26.925 [2024-05-15 04:07:14.773126] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3807111 ] 00:06:26.925 [2024-05-15 04:07:14.853544] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:27.183 [2024-05-15 04:07:14.980752] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:27.183 [2024-05-15 04:07:14.980801] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:27.183 [2024-05-15 04:07:14.980804] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:27.183 00:06:27.183 00:06:27.183 CUnit - A unit testing framework for C - Version 2.1-3 00:06:27.183 http://cunit.sourceforge.net/ 00:06:27.183 00:06:27.183 00:06:27.183 Suite: accel_dif 00:06:27.183 Test: verify: DIF generated, GUARD check ...passed 00:06:27.183 Test: verify: DIF generated, APPTAG check ...passed 00:06:27.183 Test: verify: DIF generated, REFTAG check ...passed 00:06:27.183 Test: verify: DIF not generated, GUARD check ...[2024-05-15 04:07:15.099704] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:27.183 [2024-05-15 04:07:15.099768] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:27.183 passed 00:06:27.183 Test: verify: DIF not generated, APPTAG check ...[2024-05-15 04:07:15.099815] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:27.183 [2024-05-15 04:07:15.099855] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:27.183 passed 00:06:27.183 Test: verify: DIF not generated, REFTAG check ...[2024-05-15 04:07:15.099892] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:27.183 [2024-05-15 04:07:15.099924] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:27.183 passed 00:06:27.183 Test: verify: APPTAG correct, APPTAG check ...passed 00:06:27.183 Test: verify: APPTAG incorrect, APPTAG check ...[2024-05-15 04:07:15.099995] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:06:27.183 passed 00:06:27.183 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:06:27.183 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:06:27.183 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:06:27.183 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-05-15 04:07:15.100155] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:06:27.183 passed 00:06:27.183 Test: generate copy: DIF generated, GUARD check ...passed 00:06:27.183 Test: generate copy: DIF generated, APTTAG check ...passed 00:06:27.183 Test: generate copy: DIF generated, REFTAG check ...passed 00:06:27.183 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:06:27.183 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:06:27.183 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:06:27.183 Test: generate copy: iovecs-len validate ...[2024-05-15 04:07:15.100412] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:06:27.183 passed 00:06:27.183 Test: generate copy: buffer alignment validate ...passed 00:06:27.183 00:06:27.183 Run Summary: Type Total Ran Passed Failed Inactive 00:06:27.183 suites 1 1 n/a 0 0 00:06:27.183 tests 20 20 20 0 0 00:06:27.183 asserts 204 204 204 0 n/a 00:06:27.183 00:06:27.183 Elapsed time = 0.003 seconds 00:06:27.441 00:06:27.441 real 0m0.640s 00:06:27.441 user 0m0.956s 00:06:27.441 sys 0m0.195s 00:06:27.441 04:07:15 accel.accel_dif_functional_tests -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:27.441 04:07:15 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:06:27.441 ************************************ 00:06:27.441 END TEST accel_dif_functional_tests 00:06:27.441 ************************************ 00:06:27.441 00:06:27.441 real 0m52.045s 00:06:27.441 user 1m1.257s 00:06:27.441 sys 0m9.725s 00:06:27.441 04:07:15 accel -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:27.441 04:07:15 accel -- common/autotest_common.sh@10 -- # set +x 00:06:27.441 ************************************ 00:06:27.441 END TEST accel 00:06:27.441 ************************************ 00:06:27.441 04:07:15 -- spdk/autotest.sh@180 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:27.441 04:07:15 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:27.441 04:07:15 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:27.441 04:07:15 -- common/autotest_common.sh@10 -- # set +x 00:06:27.441 ************************************ 00:06:27.441 START TEST accel_rpc 00:06:27.441 ************************************ 00:06:27.441 04:07:15 accel_rpc -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:27.699 * Looking for test storage... 00:06:27.699 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:06:27.699 04:07:15 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:27.699 04:07:15 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=3807294 00:06:27.699 04:07:15 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:06:27.699 04:07:15 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 3807294 00:06:27.699 04:07:15 accel_rpc -- common/autotest_common.sh@827 -- # '[' -z 3807294 ']' 00:06:27.699 04:07:15 accel_rpc -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:27.699 04:07:15 accel_rpc -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:27.699 04:07:15 accel_rpc -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:27.699 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:27.699 04:07:15 accel_rpc -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:27.699 04:07:15 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:27.699 [2024-05-15 04:07:15.553268] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:06:27.699 [2024-05-15 04:07:15.553358] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3807294 ] 00:06:27.699 [2024-05-15 04:07:15.630947] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.957 [2024-05-15 04:07:15.739282] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.523 04:07:16 accel_rpc -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:28.523 04:07:16 accel_rpc -- common/autotest_common.sh@860 -- # return 0 00:06:28.523 04:07:16 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:06:28.523 04:07:16 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:06:28.523 04:07:16 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:06:28.523 04:07:16 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:06:28.523 04:07:16 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:06:28.523 04:07:16 accel_rpc -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:28.523 04:07:16 accel_rpc -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:28.523 04:07:16 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:28.523 ************************************ 00:06:28.523 START TEST accel_assign_opcode 00:06:28.523 ************************************ 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1121 -- # accel_assign_opcode_test_suite 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:06:28.523 [2024-05-15 04:07:16.497677] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:06:28.523 [2024-05-15 04:07:16.505681] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.523 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:06:28.781 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:28.781 04:07:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:06:28.781 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:28.781 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:06:28.781 04:07:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r 
.copy 00:06:28.781 04:07:16 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:06:28.781 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:29.038 software 00:06:29.038 00:06:29.038 real 0m0.318s 00:06:29.038 user 0m0.036s 00:06:29.038 sys 0m0.009s 00:06:29.038 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:29.038 04:07:16 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:06:29.038 ************************************ 00:06:29.038 END TEST accel_assign_opcode 00:06:29.038 ************************************ 00:06:29.038 04:07:16 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 3807294 00:06:29.038 04:07:16 accel_rpc -- common/autotest_common.sh@946 -- # '[' -z 3807294 ']' 00:06:29.038 04:07:16 accel_rpc -- common/autotest_common.sh@950 -- # kill -0 3807294 00:06:29.038 04:07:16 accel_rpc -- common/autotest_common.sh@951 -- # uname 00:06:29.038 04:07:16 accel_rpc -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:29.038 04:07:16 accel_rpc -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3807294 00:06:29.038 04:07:16 accel_rpc -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:29.038 04:07:16 accel_rpc -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:29.038 04:07:16 accel_rpc -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3807294' 00:06:29.038 killing process with pid 3807294 00:06:29.038 04:07:16 accel_rpc -- common/autotest_common.sh@965 -- # kill 3807294 00:06:29.038 04:07:16 accel_rpc -- common/autotest_common.sh@970 -- # wait 3807294 00:06:29.604 00:06:29.604 real 0m1.878s 00:06:29.604 user 0m1.956s 00:06:29.604 sys 0m0.481s 00:06:29.604 04:07:17 accel_rpc -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:29.604 04:07:17 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:29.604 ************************************ 00:06:29.604 END TEST accel_rpc 00:06:29.604 ************************************ 00:06:29.604 04:07:17 -- spdk/autotest.sh@181 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:06:29.604 04:07:17 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:29.604 04:07:17 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:29.604 04:07:17 -- common/autotest_common.sh@10 -- # set +x 00:06:29.604 ************************************ 00:06:29.604 START TEST app_cmdline 00:06:29.604 ************************************ 00:06:29.604 04:07:17 app_cmdline -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:06:29.604 * Looking for test storage... 
00:06:29.604 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:06:29.604 04:07:17 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:06:29.604 04:07:17 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=3807515 00:06:29.604 04:07:17 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:06:29.604 04:07:17 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 3807515 00:06:29.604 04:07:17 app_cmdline -- common/autotest_common.sh@827 -- # '[' -z 3807515 ']' 00:06:29.604 04:07:17 app_cmdline -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:29.604 04:07:17 app_cmdline -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:29.604 04:07:17 app_cmdline -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:29.604 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:29.604 04:07:17 app_cmdline -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:29.604 04:07:17 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:06:29.604 [2024-05-15 04:07:17.488259] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:06:29.604 [2024-05-15 04:07:17.488332] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3807515 ] 00:06:29.604 [2024-05-15 04:07:17.568520] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.862 [2024-05-15 04:07:17.677460] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.796 04:07:18 app_cmdline -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:30.796 04:07:18 app_cmdline -- common/autotest_common.sh@860 -- # return 0 00:06:30.796 04:07:18 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:06:30.796 { 00:06:30.796 "version": "SPDK v24.05-pre git sha1 2dc74a001", 00:06:30.796 "fields": { 00:06:30.796 "major": 24, 00:06:30.796 "minor": 5, 00:06:30.796 "patch": 0, 00:06:30.796 "suffix": "-pre", 00:06:30.796 "commit": "2dc74a001" 00:06:30.796 } 00:06:30.796 } 00:06:30.796 04:07:18 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:06:30.796 04:07:18 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:06:30.796 04:07:18 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:06:30.796 04:07:18 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:06:30.796 04:07:18 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:06:30.796 04:07:18 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:30.797 04:07:18 app_cmdline -- app/cmdline.sh@26 -- # sort 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:30.797 04:07:18 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:06:30.797 04:07:18 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:06:30.797 
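The cmdline.sh flow traced above starts the target with --rpcs-allowed restricting it to two methods and then checks that exactly those two are reported back. Condensed into a standalone sketch (backgrounding and the waitforlisten handshake are simplified here):
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py rpc_get_methods | jq -r '.[]' | sort
# Anything outside the allow-list, e.g. env_dpdk_get_mem_stats, should be rejected with
# "Method not found" (code -32601), which is what the next part of the trace exercises.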
04:07:18 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:06:30.797 04:07:18 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:31.055 request: 00:06:31.055 { 00:06:31.055 "method": "env_dpdk_get_mem_stats", 00:06:31.055 "req_id": 1 00:06:31.055 } 00:06:31.055 Got JSON-RPC error response 00:06:31.055 response: 00:06:31.055 { 00:06:31.055 "code": -32601, 00:06:31.055 "message": "Method not found" 00:06:31.055 } 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:31.055 04:07:19 app_cmdline -- app/cmdline.sh@1 -- # killprocess 3807515 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@946 -- # '[' -z 3807515 ']' 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@950 -- # kill -0 3807515 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@951 -- # uname 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3807515 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3807515' 00:06:31.055 killing process with pid 3807515 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@965 -- # kill 3807515 00:06:31.055 04:07:19 app_cmdline -- common/autotest_common.sh@970 -- # wait 3807515 00:06:31.620 00:06:31.620 real 0m2.148s 00:06:31.620 user 0m2.671s 00:06:31.620 sys 0m0.537s 00:06:31.620 04:07:19 app_cmdline -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:31.620 04:07:19 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:06:31.620 
************************************ 00:06:31.620 END TEST app_cmdline 00:06:31.620 ************************************ 00:06:31.620 04:07:19 -- spdk/autotest.sh@182 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:06:31.620 04:07:19 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:31.620 04:07:19 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:31.620 04:07:19 -- common/autotest_common.sh@10 -- # set +x 00:06:31.620 ************************************ 00:06:31.620 START TEST version 00:06:31.620 ************************************ 00:06:31.620 04:07:19 version -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:06:31.620 * Looking for test storage... 00:06:31.620 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:06:31.620 04:07:19 version -- app/version.sh@17 -- # get_header_version major 00:06:31.620 04:07:19 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:06:31.620 04:07:19 version -- app/version.sh@14 -- # cut -f2 00:06:31.620 04:07:19 version -- app/version.sh@14 -- # tr -d '"' 00:06:31.879 04:07:19 version -- app/version.sh@17 -- # major=24 00:06:31.879 04:07:19 version -- app/version.sh@18 -- # get_header_version minor 00:06:31.879 04:07:19 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:06:31.879 04:07:19 version -- app/version.sh@14 -- # cut -f2 00:06:31.879 04:07:19 version -- app/version.sh@14 -- # tr -d '"' 00:06:31.879 04:07:19 version -- app/version.sh@18 -- # minor=5 00:06:31.879 04:07:19 version -- app/version.sh@19 -- # get_header_version patch 00:06:31.879 04:07:19 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:06:31.879 04:07:19 version -- app/version.sh@14 -- # cut -f2 00:06:31.879 04:07:19 version -- app/version.sh@14 -- # tr -d '"' 00:06:31.879 04:07:19 version -- app/version.sh@19 -- # patch=0 00:06:31.879 04:07:19 version -- app/version.sh@20 -- # get_header_version suffix 00:06:31.879 04:07:19 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:06:31.879 04:07:19 version -- app/version.sh@14 -- # cut -f2 00:06:31.879 04:07:19 version -- app/version.sh@14 -- # tr -d '"' 00:06:31.879 04:07:19 version -- app/version.sh@20 -- # suffix=-pre 00:06:31.879 04:07:19 version -- app/version.sh@22 -- # version=24.5 00:06:31.879 04:07:19 version -- app/version.sh@25 -- # (( patch != 0 )) 00:06:31.879 04:07:19 version -- app/version.sh@28 -- # version=24.5rc0 00:06:31.879 04:07:19 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:06:31.879 04:07:19 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:06:31.879 04:07:19 version -- app/version.sh@30 -- # py_version=24.5rc0 00:06:31.879 04:07:19 version -- app/version.sh@31 -- # [[ 24.5rc0 == \2\4\.\5\r\c\0 ]] 
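The version.sh steps traced above boil down to comparing the version string assembled from include/spdk/version.h with what the bundled Python package reports. A condensed sketch; the -pre to rc0 mapping is inferred from the 24.5 / 24.5rc0 values in the trace, not taken from the script itself.
hdr=/var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h
major=$(grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' "$hdr" | cut -f2 | tr -d '"')
minor=$(grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' "$hdr" | cut -f2 | tr -d '"')
patch=$(grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' "$hdr" | cut -f2 | tr -d '"')
suffix=$(grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' "$hdr" | cut -f2 | tr -d '"')
version=$major.$minor
(( patch != 0 )) && version=$version.$patch
[[ $suffix == -pre ]] && version=${version}rc0        # assumed mapping, matches 24.5 -> 24.5rc0 above
py_version=$(PYTHONPATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/python \
    python3 -c 'import spdk; print(spdk.__version__)')
[[ $py_version == "$version" ]]                       # 24.5rc0 == 24.5rc0 in this run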
00:06:31.879 00:06:31.879 real 0m0.107s 00:06:31.879 user 0m0.061s 00:06:31.879 sys 0m0.067s 00:06:31.879 04:07:19 version -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:31.879 04:07:19 version -- common/autotest_common.sh@10 -- # set +x 00:06:31.879 ************************************ 00:06:31.879 END TEST version 00:06:31.879 ************************************ 00:06:31.879 04:07:19 -- spdk/autotest.sh@184 -- # '[' 1 -eq 1 ']' 00:06:31.879 04:07:19 -- spdk/autotest.sh@185 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:06:31.879 04:07:19 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:31.879 04:07:19 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:31.879 04:07:19 -- common/autotest_common.sh@10 -- # set +x 00:06:31.879 ************************************ 00:06:31.879 START TEST blockdev_general 00:06:31.879 ************************************ 00:06:31.879 04:07:19 blockdev_general -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:06:31.879 * Looking for test storage... 00:06:31.879 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:31.879 04:07:19 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:06:31.879 04:07:19 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:06:31.880 04:07:19 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:06:31.880 04:07:19 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:06:31.880 04:07:19 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:06:31.880 04:07:19 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:06:31.880 04:07:19 blockdev_general 
-- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3807934 00:06:31.880 04:07:19 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:06:31.880 04:07:19 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:06:31.880 04:07:19 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 3807934 00:06:31.880 04:07:19 blockdev_general -- common/autotest_common.sh@827 -- # '[' -z 3807934 ']' 00:06:31.880 04:07:19 blockdev_general -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:31.880 04:07:19 blockdev_general -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:31.880 04:07:19 blockdev_general -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:31.880 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:31.880 04:07:19 blockdev_general -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:31.880 04:07:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:31.880 [2024-05-15 04:07:19.854645] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:06:31.880 [2024-05-15 04:07:19.854732] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3807934 ] 00:06:32.138 [2024-05-15 04:07:19.940587] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.138 [2024-05-15 04:07:20.068575] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.071 04:07:20 blockdev_general -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:33.071 04:07:20 blockdev_general -- common/autotest_common.sh@860 -- # return 0 00:06:33.071 04:07:20 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:06:33.071 04:07:20 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:06:33.071 04:07:20 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:06:33.071 04:07:20 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.071 04:07:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:33.329 [2024-05-15 04:07:21.112811] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:06:33.329 [2024-05-15 04:07:21.112893] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:06:33.329 00:06:33.329 [2024-05-15 04:07:21.120784] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:06:33.329 [2024-05-15 04:07:21.120818] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:06:33.329 00:06:33.329 Malloc0 00:06:33.329 Malloc1 00:06:33.329 Malloc2 00:06:33.329 Malloc3 00:06:33.329 Malloc4 00:06:33.329 Malloc5 00:06:33.329 Malloc6 00:06:33.329 Malloc7 00:06:33.329 Malloc8 00:06:33.329 Malloc9 00:06:33.329 [2024-05-15 04:07:21.290459] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:33.329 [2024-05-15 04:07:21.290521] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:33.329 [2024-05-15 04:07:21.290548] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25f3230 00:06:33.329 [2024-05-15 04:07:21.290564] vbdev_passthru.c: 
691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:33.329 [2024-05-15 04:07:21.291940] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:33.329 [2024-05-15 04:07:21.291970] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:06:33.329 TestPT 00:06:33.329 04:07:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.329 04:07:21 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:06:33.589 5000+0 records in 00:06:33.589 5000+0 records out 00:06:33.589 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0133578 s, 767 MB/s 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:33.589 AIO0 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:06:33.589 04:07:21 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:06:33.589 04:07:21 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:06:33.590 04:07:21 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' 
"aliases": [' ' "b56347e6-134d-4bbb-95f6-c7f7b6df45d6"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b56347e6-134d-4bbb-95f6-c7f7b6df45d6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "2dcc76db-36fd-5b20-bed9-abc082d8df21"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "2dcc76db-36fd-5b20-bed9-abc082d8df21",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "20ff41aa-41e7-5a1c-b479-82564ee83fa2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "20ff41aa-41e7-5a1c-b479-82564ee83fa2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "608c2ea9-0f8f-5b45-ba8e-98352992ad99"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "608c2ea9-0f8f-5b45-ba8e-98352992ad99",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "73876cfc-f9ee-5012-922c-48fc719deaa0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "73876cfc-f9ee-5012-922c-48fc719deaa0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "84b891af-26e3-565f-ac29-50ceee22b12f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "84b891af-26e3-565f-ac29-50ceee22b12f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "5ca11368-9f6e-539a-aa6a-ff35eec02037"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5ca11368-9f6e-539a-aa6a-ff35eec02037",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "c747978b-fd45-5874-9e07-ad8241c954e1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c747978b-fd45-5874-9e07-ad8241c954e1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "1d666b01-c929-5dfe-999d-30a3e00c7a7a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1d666b01-c929-5dfe-999d-30a3e00c7a7a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "30ee2496-49d8-5b22-a283-c51bc6755f3a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 
8192,' ' "uuid": "30ee2496-49d8-5b22-a283-c51bc6755f3a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "f632768c-1cca-547e-bcc0-b37b5e9cb1c5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f632768c-1cca-547e-bcc0-b37b5e9cb1c5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "735f4fb4-44c7-566c-9de4-c5a40e5c79c5"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "735f4fb4-44c7-566c-9de4-c5a40e5c79c5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "13ffda48-aefe-46b3-87f0-810b433ecc0a"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "13ffda48-aefe-46b3-87f0-810b433ecc0a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "13ffda48-aefe-46b3-87f0-810b433ecc0a",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 
2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "916715ef-e5e6-49fc-bae4-da39ea36f454",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "3504f656-ff27-4692-9f43-42e77fdd1998",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "c0ab1585-622f-47a6-b356-eacf0a81ff84"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "c0ab1585-622f-47a6-b356-eacf0a81ff84",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c0ab1585-622f-47a6-b356-eacf0a81ff84",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "a14bb9d4-085b-4d18-add2-a8f2cef9d25e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "dec0015f-a1af-4b8f-9a2a-62fc58c0abd1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "68628006-718a-494c-a1ef-c0c7e9bd1a8f"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "68628006-718a-494c-a1ef-c0c7e9bd1a8f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "68628006-718a-494c-a1ef-c0c7e9bd1a8f",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "311a4465-bc7a-4fea-aa4a-77aa49fd36d6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e40748dc-5932-4de0-8806-ae3983b8441c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' 
' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "f47942cf-9af6-4096-aa3f-a4afd351e1fc"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "f47942cf-9af6-4096-aa3f-a4afd351e1fc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:06:33.590 04:07:21 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:06:33.590 04:07:21 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:06:33.590 04:07:21 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:06:33.590 04:07:21 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 3807934 00:06:33.590 04:07:21 blockdev_general -- common/autotest_common.sh@946 -- # '[' -z 3807934 ']' 00:06:33.590 04:07:21 blockdev_general -- common/autotest_common.sh@950 -- # kill -0 3807934 00:06:33.590 04:07:21 blockdev_general -- common/autotest_common.sh@951 -- # uname 00:06:33.590 04:07:21 blockdev_general -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:33.590 04:07:21 blockdev_general -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3807934 00:06:33.848 04:07:21 blockdev_general -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:33.848 04:07:21 blockdev_general -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:33.848 04:07:21 blockdev_general -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3807934' 00:06:33.848 killing process with pid 3807934 00:06:33.848 04:07:21 blockdev_general -- common/autotest_common.sh@965 -- # kill 3807934 00:06:33.848 04:07:21 blockdev_general -- common/autotest_common.sh@970 -- # wait 3807934 00:06:34.440 04:07:22 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:06:34.440 04:07:22 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:06:34.440 04:07:22 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:06:34.440 04:07:22 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:34.440 04:07:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:34.440 ************************************ 00:06:34.440 START TEST bdev_hello_world 00:06:34.440 ************************************ 00:06:34.440 04:07:22 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:06:34.440 [2024-05-15 04:07:22.323965] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
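[Note] The long printf trace above is the harness echoing the bdev_get_bdevs output it just collected, filtered to unclaimed bdevs and reduced to their names; Malloc0, the first entry, is then taken as the hello-world bdev. A minimal stand-alone sketch of that enumeration, assuming an SPDK app is running with the same bdev.json loaded and listening on the default RPC socket (relative paths are an assumption of this sketch):

    # dump all bdevs, keep only the unclaimed ones, then print just their names
    ./scripts/rpc.py bdev_get_bdevs \
        | jq -r '.[] | select(.claimed == false)' \
        | jq -r .name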
00:06:34.440 [2024-05-15 04:07:22.324036] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3808314 ] 00:06:34.440 [2024-05-15 04:07:22.405649] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.704 [2024-05-15 04:07:22.530645] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.704 [2024-05-15 04:07:22.709866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:34.704 [2024-05-15 04:07:22.709953] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:34.704 [2024-05-15 04:07:22.709973] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:34.704 [2024-05-15 04:07:22.717863] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:06:34.704 [2024-05-15 04:07:22.717898] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:06:34.962 [2024-05-15 04:07:22.725893] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:06:34.962 [2024-05-15 04:07:22.725928] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:06:34.962 [2024-05-15 04:07:22.810688] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:34.962 [2024-05-15 04:07:22.810769] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:34.962 [2024-05-15 04:07:22.810795] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10eac20 00:06:34.962 [2024-05-15 04:07:22.810811] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:34.962 [2024-05-15 04:07:22.812673] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:34.962 [2024-05-15 04:07:22.812704] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:06:34.962 [2024-05-15 04:07:22.969147] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:06:34.962 [2024-05-15 04:07:22.969202] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:06:34.962 [2024-05-15 04:07:22.969232] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:06:34.962 [2024-05-15 04:07:22.969273] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:06:34.962 [2024-05-15 04:07:22.969316] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:06:34.962 [2024-05-15 04:07:22.969337] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:06:34.962 [2024-05-15 04:07:22.969373] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
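[Note] The notices above are the complete hello_bdev round trip: the example opens Malloc0 from the generated bdev.json, opens an I/O channel, writes its greeting, and reads back "Hello World!". Outside the harness the same run is roughly the following (repo-relative paths assumed; the job itself uses absolute workspace paths):

    # write and read back a string on the Malloc0 bdev described in bdev.json
    ./build/examples/hello_bdev --json ./test/bdev/bdev.json -b Malloc0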
00:06:34.962 00:06:34.962 [2024-05-15 04:07:22.969400] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:06:35.525 00:06:35.525 real 0m1.095s 00:06:35.525 user 0m0.742s 00:06:35.525 sys 0m0.306s 00:06:35.525 04:07:23 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:35.525 04:07:23 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:06:35.525 ************************************ 00:06:35.525 END TEST bdev_hello_world 00:06:35.525 ************************************ 00:06:35.525 04:07:23 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:06:35.525 04:07:23 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:06:35.525 04:07:23 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:35.525 04:07:23 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:35.525 ************************************ 00:06:35.525 START TEST bdev_bounds 00:06:35.525 ************************************ 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3808468 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3808468' 00:06:35.525 Process bdevio pid: 3808468 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3808468 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 3808468 ']' 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:35.525 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:35.525 04:07:23 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:06:35.525 [2024-05-15 04:07:23.478412] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
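[Note] The bdevio output that follows drives the CUnit I/O suites against every bdev in bdev.json. As used here, bdevio appears to be started with -w so it waits until tests.py issues the perform_tests RPC; a rough stand-alone equivalent (backgrounding and relative paths are assumptions of this sketch):

    # start the bdevio exerciser and let it wait for the perform_tests RPC
    ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &
    # once the app is up, trigger all suites
    ./test/bdev/bdevio/tests.py perform_tests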
00:06:35.525 [2024-05-15 04:07:23.478500] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3808468 ] 00:06:35.783 [2024-05-15 04:07:23.561571] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:35.783 [2024-05-15 04:07:23.684927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:35.783 [2024-05-15 04:07:23.684982] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:35.783 [2024-05-15 04:07:23.684985] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.041 [2024-05-15 04:07:23.858214] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:36.041 [2024-05-15 04:07:23.858304] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:36.041 [2024-05-15 04:07:23.858324] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:36.041 [2024-05-15 04:07:23.866216] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:06:36.041 [2024-05-15 04:07:23.866251] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:06:36.041 [2024-05-15 04:07:23.874227] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:06:36.041 [2024-05-15 04:07:23.874261] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:06:36.041 [2024-05-15 04:07:23.959581] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:36.041 [2024-05-15 04:07:23.959668] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:36.041 [2024-05-15 04:07:23.959694] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b11150 00:06:36.041 [2024-05-15 04:07:23.959710] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:36.041 [2024-05-15 04:07:23.961495] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:36.041 [2024-05-15 04:07:23.961526] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:06:36.608 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:36.608 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:06:36.608 04:07:24 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:06:36.608 I/O targets: 00:06:36.608 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:06:36.608 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:06:36.608 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:06:36.608 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:06:36.608 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:06:36.608 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:06:36.608 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:06:36.608 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:06:36.608 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:06:36.608 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:06:36.608 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:06:36.608 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:06:36.608 raid0: 131072 blocks of 512 bytes (64 MiB) 00:06:36.608 concat0: 131072 blocks of 512 bytes (64 MiB) 00:06:36.608 raid1: 65536 
blocks of 512 bytes (32 MiB) 00:06:36.608 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:06:36.608 00:06:36.608 00:06:36.608 CUnit - A unit testing framework for C - Version 2.1-3 00:06:36.608 http://cunit.sourceforge.net/ 00:06:36.608 00:06:36.608 00:06:36.608 Suite: bdevio tests on: AIO0 00:06:36.608 Test: blockdev write read block ...passed 00:06:36.608 Test: blockdev write zeroes read block ...passed 00:06:36.608 Test: blockdev write zeroes read no split ...passed 00:06:36.608 Test: blockdev write zeroes read split ...passed 00:06:36.608 Test: blockdev write zeroes read split partial ...passed 00:06:36.608 Test: blockdev reset ...passed 00:06:36.608 Test: blockdev write read 8 blocks ...passed 00:06:36.608 Test: blockdev write read size > 128k ...passed 00:06:36.608 Test: blockdev write read invalid size ...passed 00:06:36.608 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.608 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.608 Test: blockdev write read max offset ...passed 00:06:36.608 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.608 Test: blockdev writev readv 8 blocks ...passed 00:06:36.608 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.608 Test: blockdev writev readv block ...passed 00:06:36.608 Test: blockdev writev readv size > 128k ...passed 00:06:36.608 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.608 Test: blockdev comparev and writev ...passed 00:06:36.608 Test: blockdev nvme passthru rw ...passed 00:06:36.608 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.608 Test: blockdev nvme admin passthru ...passed 00:06:36.608 Test: blockdev copy ...passed 00:06:36.608 Suite: bdevio tests on: raid1 00:06:36.608 Test: blockdev write read block ...passed 00:06:36.608 Test: blockdev write zeroes read block ...passed 00:06:36.608 Test: blockdev write zeroes read no split ...passed 00:06:36.608 Test: blockdev write zeroes read split ...passed 00:06:36.608 Test: blockdev write zeroes read split partial ...passed 00:06:36.608 Test: blockdev reset ...passed 00:06:36.608 Test: blockdev write read 8 blocks ...passed 00:06:36.608 Test: blockdev write read size > 128k ...passed 00:06:36.608 Test: blockdev write read invalid size ...passed 00:06:36.608 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.608 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.608 Test: blockdev write read max offset ...passed 00:06:36.608 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.608 Test: blockdev writev readv 8 blocks ...passed 00:06:36.608 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.608 Test: blockdev writev readv block ...passed 00:06:36.608 Test: blockdev writev readv size > 128k ...passed 00:06:36.608 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.608 Test: blockdev comparev and writev ...passed 00:06:36.608 Test: blockdev nvme passthru rw ...passed 00:06:36.608 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.608 Test: blockdev nvme admin passthru ...passed 00:06:36.608 Test: blockdev copy ...passed 00:06:36.608 Suite: bdevio tests on: concat0 00:06:36.608 Test: blockdev write read block ...passed 00:06:36.608 Test: blockdev write zeroes read block ...passed 00:06:36.608 Test: blockdev write zeroes read no split ...passed 00:06:36.608 Test: blockdev write zeroes read split ...passed 00:06:36.608 Test: 
blockdev write zeroes read split partial ...passed 00:06:36.608 Test: blockdev reset ...passed 00:06:36.608 Test: blockdev write read 8 blocks ...passed 00:06:36.608 Test: blockdev write read size > 128k ...passed 00:06:36.608 Test: blockdev write read invalid size ...passed 00:06:36.608 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.608 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.608 Test: blockdev write read max offset ...passed 00:06:36.608 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.608 Test: blockdev writev readv 8 blocks ...passed 00:06:36.608 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.608 Test: blockdev writev readv block ...passed 00:06:36.608 Test: blockdev writev readv size > 128k ...passed 00:06:36.608 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.608 Test: blockdev comparev and writev ...passed 00:06:36.608 Test: blockdev nvme passthru rw ...passed 00:06:36.608 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.608 Test: blockdev nvme admin passthru ...passed 00:06:36.608 Test: blockdev copy ...passed 00:06:36.608 Suite: bdevio tests on: raid0 00:06:36.608 Test: blockdev write read block ...passed 00:06:36.608 Test: blockdev write zeroes read block ...passed 00:06:36.608 Test: blockdev write zeroes read no split ...passed 00:06:36.608 Test: blockdev write zeroes read split ...passed 00:06:36.608 Test: blockdev write zeroes read split partial ...passed 00:06:36.608 Test: blockdev reset ...passed 00:06:36.608 Test: blockdev write read 8 blocks ...passed 00:06:36.608 Test: blockdev write read size > 128k ...passed 00:06:36.609 Test: blockdev write read invalid size ...passed 00:06:36.609 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.609 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.609 Test: blockdev write read max offset ...passed 00:06:36.609 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.609 Test: blockdev writev readv 8 blocks ...passed 00:06:36.609 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.609 Test: blockdev writev readv block ...passed 00:06:36.609 Test: blockdev writev readv size > 128k ...passed 00:06:36.609 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.609 Test: blockdev comparev and writev ...passed 00:06:36.609 Test: blockdev nvme passthru rw ...passed 00:06:36.609 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.609 Test: blockdev nvme admin passthru ...passed 00:06:36.609 Test: blockdev copy ...passed 00:06:36.609 Suite: bdevio tests on: TestPT 00:06:36.609 Test: blockdev write read block ...passed 00:06:36.609 Test: blockdev write zeroes read block ...passed 00:06:36.609 Test: blockdev write zeroes read no split ...passed 00:06:36.609 Test: blockdev write zeroes read split ...passed 00:06:36.609 Test: blockdev write zeroes read split partial ...passed 00:06:36.609 Test: blockdev reset ...passed 00:06:36.867 Test: blockdev write read 8 blocks ...passed 00:06:36.867 Test: blockdev write read size > 128k ...passed 00:06:36.867 Test: blockdev write read invalid size ...passed 00:06:36.867 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.867 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.867 Test: blockdev write read max offset ...passed 00:06:36.867 Test: blockdev write read 2 blocks on 
overlapped address offset ...passed 00:06:36.867 Test: blockdev writev readv 8 blocks ...passed 00:06:36.867 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.867 Test: blockdev writev readv block ...passed 00:06:36.867 Test: blockdev writev readv size > 128k ...passed 00:06:36.867 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.867 Test: blockdev comparev and writev ...passed 00:06:36.867 Test: blockdev nvme passthru rw ...passed 00:06:36.867 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.867 Test: blockdev nvme admin passthru ...passed 00:06:36.867 Test: blockdev copy ...passed 00:06:36.868 Suite: bdevio tests on: Malloc2p7 00:06:36.868 Test: blockdev write read block ...passed 00:06:36.868 Test: blockdev write zeroes read block ...passed 00:06:36.868 Test: blockdev write zeroes read no split ...passed 00:06:36.868 Test: blockdev write zeroes read split ...passed 00:06:36.868 Test: blockdev write zeroes read split partial ...passed 00:06:36.868 Test: blockdev reset ...passed 00:06:36.868 Test: blockdev write read 8 blocks ...passed 00:06:36.868 Test: blockdev write read size > 128k ...passed 00:06:36.868 Test: blockdev write read invalid size ...passed 00:06:36.868 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.868 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.868 Test: blockdev write read max offset ...passed 00:06:36.868 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.868 Test: blockdev writev readv 8 blocks ...passed 00:06:36.868 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.868 Test: blockdev writev readv block ...passed 00:06:36.868 Test: blockdev writev readv size > 128k ...passed 00:06:36.868 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.868 Test: blockdev comparev and writev ...passed 00:06:36.868 Test: blockdev nvme passthru rw ...passed 00:06:36.868 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.868 Test: blockdev nvme admin passthru ...passed 00:06:36.868 Test: blockdev copy ...passed 00:06:36.868 Suite: bdevio tests on: Malloc2p6 00:06:36.868 Test: blockdev write read block ...passed 00:06:36.868 Test: blockdev write zeroes read block ...passed 00:06:36.868 Test: blockdev write zeroes read no split ...passed 00:06:36.868 Test: blockdev write zeroes read split ...passed 00:06:36.868 Test: blockdev write zeroes read split partial ...passed 00:06:36.868 Test: blockdev reset ...passed 00:06:36.868 Test: blockdev write read 8 blocks ...passed 00:06:36.868 Test: blockdev write read size > 128k ...passed 00:06:36.868 Test: blockdev write read invalid size ...passed 00:06:36.868 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.868 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.868 Test: blockdev write read max offset ...passed 00:06:36.868 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.868 Test: blockdev writev readv 8 blocks ...passed 00:06:36.868 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.868 Test: blockdev writev readv block ...passed 00:06:36.868 Test: blockdev writev readv size > 128k ...passed 00:06:36.868 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.868 Test: blockdev comparev and writev ...passed 00:06:36.868 Test: blockdev nvme passthru rw ...passed 00:06:36.868 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.868 
Test: blockdev nvme admin passthru ...passed 00:06:36.868 Test: blockdev copy ...passed 00:06:36.868 Suite: bdevio tests on: Malloc2p5 00:06:36.868 Test: blockdev write read block ...passed 00:06:36.868 Test: blockdev write zeroes read block ...passed 00:06:36.868 Test: blockdev write zeroes read no split ...passed 00:06:36.868 Test: blockdev write zeroes read split ...passed 00:06:36.868 Test: blockdev write zeroes read split partial ...passed 00:06:36.868 Test: blockdev reset ...passed 00:06:36.868 Test: blockdev write read 8 blocks ...passed 00:06:36.868 Test: blockdev write read size > 128k ...passed 00:06:36.868 Test: blockdev write read invalid size ...passed 00:06:36.868 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.868 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.868 Test: blockdev write read max offset ...passed 00:06:36.868 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.868 Test: blockdev writev readv 8 blocks ...passed 00:06:36.868 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.868 Test: blockdev writev readv block ...passed 00:06:36.868 Test: blockdev writev readv size > 128k ...passed 00:06:36.868 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.868 Test: blockdev comparev and writev ...passed 00:06:36.868 Test: blockdev nvme passthru rw ...passed 00:06:36.868 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.868 Test: blockdev nvme admin passthru ...passed 00:06:36.868 Test: blockdev copy ...passed 00:06:36.868 Suite: bdevio tests on: Malloc2p4 00:06:36.868 Test: blockdev write read block ...passed 00:06:36.868 Test: blockdev write zeroes read block ...passed 00:06:36.868 Test: blockdev write zeroes read no split ...passed 00:06:36.868 Test: blockdev write zeroes read split ...passed 00:06:36.868 Test: blockdev write zeroes read split partial ...passed 00:06:36.868 Test: blockdev reset ...passed 00:06:36.868 Test: blockdev write read 8 blocks ...passed 00:06:36.868 Test: blockdev write read size > 128k ...passed 00:06:36.868 Test: blockdev write read invalid size ...passed 00:06:36.868 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.868 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.868 Test: blockdev write read max offset ...passed 00:06:36.868 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.868 Test: blockdev writev readv 8 blocks ...passed 00:06:36.868 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.868 Test: blockdev writev readv block ...passed 00:06:36.868 Test: blockdev writev readv size > 128k ...passed 00:06:36.868 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.868 Test: blockdev comparev and writev ...passed 00:06:36.868 Test: blockdev nvme passthru rw ...passed 00:06:36.868 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.868 Test: blockdev nvme admin passthru ...passed 00:06:36.868 Test: blockdev copy ...passed 00:06:36.868 Suite: bdevio tests on: Malloc2p3 00:06:36.868 Test: blockdev write read block ...passed 00:06:36.868 Test: blockdev write zeroes read block ...passed 00:06:36.868 Test: blockdev write zeroes read no split ...passed 00:06:36.868 Test: blockdev write zeroes read split ...passed 00:06:36.868 Test: blockdev write zeroes read split partial ...passed 00:06:36.868 Test: blockdev reset ...passed 00:06:36.868 Test: blockdev write read 8 blocks ...passed 
00:06:36.868 Test: blockdev write read size > 128k ...passed 00:06:36.868 Test: blockdev write read invalid size ...passed 00:06:36.868 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.868 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.868 Test: blockdev write read max offset ...passed 00:06:36.868 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.868 Test: blockdev writev readv 8 blocks ...passed 00:06:36.868 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.868 Test: blockdev writev readv block ...passed 00:06:36.868 Test: blockdev writev readv size > 128k ...passed 00:06:36.868 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.868 Test: blockdev comparev and writev ...passed 00:06:36.868 Test: blockdev nvme passthru rw ...passed 00:06:36.868 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.868 Test: blockdev nvme admin passthru ...passed 00:06:36.868 Test: blockdev copy ...passed 00:06:36.868 Suite: bdevio tests on: Malloc2p2 00:06:36.868 Test: blockdev write read block ...passed 00:06:36.868 Test: blockdev write zeroes read block ...passed 00:06:36.868 Test: blockdev write zeroes read no split ...passed 00:06:36.868 Test: blockdev write zeroes read split ...passed 00:06:36.868 Test: blockdev write zeroes read split partial ...passed 00:06:36.868 Test: blockdev reset ...passed 00:06:36.868 Test: blockdev write read 8 blocks ...passed 00:06:36.868 Test: blockdev write read size > 128k ...passed 00:06:36.868 Test: blockdev write read invalid size ...passed 00:06:36.868 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.868 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.868 Test: blockdev write read max offset ...passed 00:06:36.868 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.868 Test: blockdev writev readv 8 blocks ...passed 00:06:36.868 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.868 Test: blockdev writev readv block ...passed 00:06:36.868 Test: blockdev writev readv size > 128k ...passed 00:06:36.868 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.868 Test: blockdev comparev and writev ...passed 00:06:36.868 Test: blockdev nvme passthru rw ...passed 00:06:36.868 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.868 Test: blockdev nvme admin passthru ...passed 00:06:36.868 Test: blockdev copy ...passed 00:06:36.868 Suite: bdevio tests on: Malloc2p1 00:06:36.868 Test: blockdev write read block ...passed 00:06:36.868 Test: blockdev write zeroes read block ...passed 00:06:36.868 Test: blockdev write zeroes read no split ...passed 00:06:36.868 Test: blockdev write zeroes read split ...passed 00:06:36.868 Test: blockdev write zeroes read split partial ...passed 00:06:36.868 Test: blockdev reset ...passed 00:06:36.868 Test: blockdev write read 8 blocks ...passed 00:06:36.868 Test: blockdev write read size > 128k ...passed 00:06:36.868 Test: blockdev write read invalid size ...passed 00:06:36.868 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.868 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.868 Test: blockdev write read max offset ...passed 00:06:36.868 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.868 Test: blockdev writev readv 8 blocks ...passed 00:06:36.868 Test: blockdev writev readv 30 x 
1block ...passed 00:06:36.868 Test: blockdev writev readv block ...passed 00:06:36.868 Test: blockdev writev readv size > 128k ...passed 00:06:36.868 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.868 Test: blockdev comparev and writev ...passed 00:06:36.868 Test: blockdev nvme passthru rw ...passed 00:06:36.868 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.868 Test: blockdev nvme admin passthru ...passed 00:06:36.868 Test: blockdev copy ...passed 00:06:36.868 Suite: bdevio tests on: Malloc2p0 00:06:36.868 Test: blockdev write read block ...passed 00:06:36.868 Test: blockdev write zeroes read block ...passed 00:06:36.868 Test: blockdev write zeroes read no split ...passed 00:06:36.868 Test: blockdev write zeroes read split ...passed 00:06:36.868 Test: blockdev write zeroes read split partial ...passed 00:06:36.868 Test: blockdev reset ...passed 00:06:36.868 Test: blockdev write read 8 blocks ...passed 00:06:36.868 Test: blockdev write read size > 128k ...passed 00:06:36.868 Test: blockdev write read invalid size ...passed 00:06:36.868 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.868 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.868 Test: blockdev write read max offset ...passed 00:06:36.868 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.869 Test: blockdev writev readv 8 blocks ...passed 00:06:36.869 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.869 Test: blockdev writev readv block ...passed 00:06:36.869 Test: blockdev writev readv size > 128k ...passed 00:06:36.869 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.869 Test: blockdev comparev and writev ...passed 00:06:36.869 Test: blockdev nvme passthru rw ...passed 00:06:36.869 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.869 Test: blockdev nvme admin passthru ...passed 00:06:36.869 Test: blockdev copy ...passed 00:06:36.869 Suite: bdevio tests on: Malloc1p1 00:06:36.869 Test: blockdev write read block ...passed 00:06:36.869 Test: blockdev write zeroes read block ...passed 00:06:36.869 Test: blockdev write zeroes read no split ...passed 00:06:36.869 Test: blockdev write zeroes read split ...passed 00:06:36.869 Test: blockdev write zeroes read split partial ...passed 00:06:36.869 Test: blockdev reset ...passed 00:06:36.869 Test: blockdev write read 8 blocks ...passed 00:06:36.869 Test: blockdev write read size > 128k ...passed 00:06:36.869 Test: blockdev write read invalid size ...passed 00:06:36.869 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.869 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.869 Test: blockdev write read max offset ...passed 00:06:36.869 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.869 Test: blockdev writev readv 8 blocks ...passed 00:06:36.869 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.869 Test: blockdev writev readv block ...passed 00:06:36.869 Test: blockdev writev readv size > 128k ...passed 00:06:36.869 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.869 Test: blockdev comparev and writev ...passed 00:06:36.869 Test: blockdev nvme passthru rw ...passed 00:06:36.869 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.869 Test: blockdev nvme admin passthru ...passed 00:06:36.869 Test: blockdev copy ...passed 00:06:36.869 Suite: bdevio tests on: Malloc1p0 
00:06:36.869 Test: blockdev write read block ...passed 00:06:36.869 Test: blockdev write zeroes read block ...passed 00:06:36.869 Test: blockdev write zeroes read no split ...passed 00:06:36.869 Test: blockdev write zeroes read split ...passed 00:06:36.869 Test: blockdev write zeroes read split partial ...passed 00:06:36.869 Test: blockdev reset ...passed 00:06:36.869 Test: blockdev write read 8 blocks ...passed 00:06:36.869 Test: blockdev write read size > 128k ...passed 00:06:36.869 Test: blockdev write read invalid size ...passed 00:06:36.869 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.869 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.869 Test: blockdev write read max offset ...passed 00:06:36.869 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.869 Test: blockdev writev readv 8 blocks ...passed 00:06:36.869 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.869 Test: blockdev writev readv block ...passed 00:06:36.869 Test: blockdev writev readv size > 128k ...passed 00:06:36.869 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.869 Test: blockdev comparev and writev ...passed 00:06:36.869 Test: blockdev nvme passthru rw ...passed 00:06:36.869 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.869 Test: blockdev nvme admin passthru ...passed 00:06:36.869 Test: blockdev copy ...passed 00:06:36.869 Suite: bdevio tests on: Malloc0 00:06:36.869 Test: blockdev write read block ...passed 00:06:36.869 Test: blockdev write zeroes read block ...passed 00:06:36.869 Test: blockdev write zeroes read no split ...passed 00:06:36.869 Test: blockdev write zeroes read split ...passed 00:06:36.869 Test: blockdev write zeroes read split partial ...passed 00:06:36.869 Test: blockdev reset ...passed 00:06:36.869 Test: blockdev write read 8 blocks ...passed 00:06:36.869 Test: blockdev write read size > 128k ...passed 00:06:36.869 Test: blockdev write read invalid size ...passed 00:06:36.869 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:36.869 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:36.869 Test: blockdev write read max offset ...passed 00:06:36.869 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:36.869 Test: blockdev writev readv 8 blocks ...passed 00:06:36.869 Test: blockdev writev readv 30 x 1block ...passed 00:06:36.869 Test: blockdev writev readv block ...passed 00:06:36.869 Test: blockdev writev readv size > 128k ...passed 00:06:36.869 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:36.869 Test: blockdev comparev and writev ...passed 00:06:36.869 Test: blockdev nvme passthru rw ...passed 00:06:36.869 Test: blockdev nvme passthru vendor specific ...passed 00:06:36.869 Test: blockdev nvme admin passthru ...passed 00:06:36.869 Test: blockdev copy ...passed 00:06:36.869 00:06:36.869 Run Summary: Type Total Ran Passed Failed Inactive 00:06:36.869 suites 16 16 n/a 0 0 00:06:36.869 tests 368 368 368 0 0 00:06:36.869 asserts 2224 2224 2224 0 n/a 00:06:36.869 00:06:36.869 Elapsed time = 0.566 seconds 00:06:36.869 0 00:06:36.869 04:07:24 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3808468 00:06:36.869 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 3808468 ']' 00:06:36.869 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 3808468 00:06:36.869 04:07:24 
blockdev_general.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:06:36.869 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:36.869 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3808468 00:06:36.869 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:36.869 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:36.869 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3808468' 00:06:36.869 killing process with pid 3808468 00:06:36.869 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@965 -- # kill 3808468 00:06:36.869 04:07:24 blockdev_general.bdev_bounds -- common/autotest_common.sh@970 -- # wait 3808468 00:06:37.435 04:07:25 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:06:37.435 00:06:37.435 real 0m1.791s 00:06:37.435 user 0m4.475s 00:06:37.435 sys 0m0.460s 00:06:37.435 04:07:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:37.435 04:07:25 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:06:37.435 ************************************ 00:06:37.435 END TEST bdev_bounds 00:06:37.435 ************************************ 00:06:37.435 04:07:25 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:06:37.435 04:07:25 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:06:37.435 04:07:25 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:37.435 04:07:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:06:37.435 ************************************ 00:06:37.435 START TEST bdev_nbd 00:06:37.435 ************************************ 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3808644 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3808644 /var/tmp/spdk-nbd.sock 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 3808644 ']' 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:37.435 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:37.435 04:07:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:06:37.435 [2024-05-15 04:07:25.331667] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
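[Note] For the nbd test, bdev_svc is started on a dedicated RPC socket (/var/tmp/spdk-nbd.sock) so each bdev can be exported through the kernel NBD driver and exercised as an ordinary block device. The per-device pattern that repeats in the trace below, sketched stand-alone (the dd output path here is arbitrary):

    # attach Malloc0 to an NBD device; the RPC prints the /dev/nbdX it used
    ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0
    # wait until the kernel reports the device, then read one 4 KiB block through it
    grep -q -w nbd0 /proc/partitions
    dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct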
00:06:37.435 [2024-05-15 04:07:25.331735] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:06:37.435 [2024-05-15 04:07:25.410270] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.693 [2024-05-15 04:07:25.521720] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.694 [2024-05-15 04:07:25.692254] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:37.694 [2024-05-15 04:07:25.692335] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:06:37.694 [2024-05-15 04:07:25.692354] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:06:37.694 [2024-05-15 04:07:25.700260] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:06:37.694 [2024-05-15 04:07:25.700295] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:06:37.694 [2024-05-15 04:07:25.708266] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:06:37.694 [2024-05-15 04:07:25.708298] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:06:37.952 [2024-05-15 04:07:25.793086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:06:37.952 [2024-05-15 04:07:25.793163] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:37.952 [2024-05-15 04:07:25.793188] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10fae50 00:06:37.952 [2024-05-15 04:07:25.793203] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:37.952 [2024-05-15 04:07:25.795048] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:37.952 [2024-05-15 04:07:25.795080] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 
'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:06:38.517 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:06:38.518 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:38.518 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:38.776 1+0 records in 00:06:38.776 1+0 records out 00:06:38.776 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000179109 s, 22.9 MB/s 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:38.776 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:39.034 1+0 records in 00:06:39.034 1+0 records out 00:06:39.034 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000202688 s, 20.2 MB/s 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:39.034 04:07:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:39.293 1+0 records in 00:06:39.293 1+0 records out 00:06:39.293 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225118 s, 18.2 MB/s 00:06:39.293 04:07:27 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:39.293 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:39.551 1+0 records in 00:06:39.551 1+0 records out 00:06:39.551 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233139 s, 17.6 MB/s 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:39.551 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 
00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd4 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd4 /proc/partitions 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:39.810 1+0 records in 00:06:39.810 1+0 records out 00:06:39.810 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237445 s, 17.3 MB/s 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:39.810 04:07:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd5 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd5 /proc/partitions 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:40.069 1+0 records in 
00:06:40.069 1+0 records out 00:06:40.069 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298712 s, 13.7 MB/s 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:40.069 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd6 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd6 /proc/partitions 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:40.327 1+0 records in 00:06:40.327 1+0 records out 00:06:40.327 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260286 s, 15.7 MB/s 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:40.327 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:06:40.893 04:07:28 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd7 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd7 /proc/partitions 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:40.893 1+0 records in 00:06:40.893 1+0 records out 00:06:40.893 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311085 s, 13.2 MB/s 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd8 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd8 /proc/partitions 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@881 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:40.893 1+0 records in 00:06:40.893 1+0 records out 00:06:40.893 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000478096 s, 8.6 MB/s 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:40.893 04:07:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd9 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd9 /proc/partitions 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:41.151 1+0 records in 00:06:41.151 1+0 records out 00:06:41.151 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319564 s, 12.8 MB/s 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:41.151 04:07:29 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:41.409 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:41.410 1+0 records in 00:06:41.410 1+0 records out 00:06:41.410 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000419056 s, 9.8 MB/s 00:06:41.410 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:41.410 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:41.410 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:41.410 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:41.410 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:41.410 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:41.410 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:41.410 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( 
i = 1 )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:41.975 1+0 records in 00:06:41.975 1+0 records out 00:06:41.975 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000360694 s, 11.4 MB/s 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd12 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd12 /proc/partitions 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:41.975 1+0 records in 00:06:41.975 1+0 records out 00:06:41.975 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000381418 s, 10.7 MB/s 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 
-- # (( i++ )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:41.975 04:07:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd13 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd13 /proc/partitions 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:42.233 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:42.233 1+0 records in 00:06:42.233 1+0 records out 00:06:42.233 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000519241 s, 7.9 MB/s 00:06:42.491 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:42.491 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:42.491 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:42.491 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:42.491 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:42.491 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:42.491 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:42.491 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:06:42.749 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:06:42.749 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:06:42.749 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd14 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd14 /proc/partitions 00:06:42.750 04:07:30 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:42.750 1+0 records in 00:06:42.750 1+0 records out 00:06:42.750 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000385185 s, 10.6 MB/s 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:42.750 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd15 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd15 /proc/partitions 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:43.008 1+0 records in 00:06:43.008 1+0 records out 00:06:43.008 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000527148 s, 7.8 MB/s 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:43.008 
04:07:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:06:43.008 04:07:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd0", 00:06:43.266 "bdev_name": "Malloc0" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd1", 00:06:43.266 "bdev_name": "Malloc1p0" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd2", 00:06:43.266 "bdev_name": "Malloc1p1" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd3", 00:06:43.266 "bdev_name": "Malloc2p0" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd4", 00:06:43.266 "bdev_name": "Malloc2p1" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd5", 00:06:43.266 "bdev_name": "Malloc2p2" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd6", 00:06:43.266 "bdev_name": "Malloc2p3" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd7", 00:06:43.266 "bdev_name": "Malloc2p4" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd8", 00:06:43.266 "bdev_name": "Malloc2p5" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd9", 00:06:43.266 "bdev_name": "Malloc2p6" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd10", 00:06:43.266 "bdev_name": "Malloc2p7" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd11", 00:06:43.266 "bdev_name": "TestPT" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd12", 00:06:43.266 "bdev_name": "raid0" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd13", 00:06:43.266 "bdev_name": "concat0" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd14", 00:06:43.266 "bdev_name": "raid1" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd15", 00:06:43.266 "bdev_name": "AIO0" 00:06:43.266 } 00:06:43.266 ]' 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd0", 00:06:43.266 "bdev_name": "Malloc0" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd1", 00:06:43.266 "bdev_name": "Malloc1p0" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd2", 00:06:43.266 "bdev_name": "Malloc1p1" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd3", 00:06:43.266 "bdev_name": "Malloc2p0" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd4", 00:06:43.266 "bdev_name": "Malloc2p1" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd5", 00:06:43.266 "bdev_name": "Malloc2p2" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd6", 00:06:43.266 "bdev_name": "Malloc2p3" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd7", 00:06:43.266 "bdev_name": "Malloc2p4" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd8", 00:06:43.266 "bdev_name": "Malloc2p5" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": 
"/dev/nbd9", 00:06:43.266 "bdev_name": "Malloc2p6" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd10", 00:06:43.266 "bdev_name": "Malloc2p7" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd11", 00:06:43.266 "bdev_name": "TestPT" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd12", 00:06:43.266 "bdev_name": "raid0" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd13", 00:06:43.266 "bdev_name": "concat0" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd14", 00:06:43.266 "bdev_name": "raid1" 00:06:43.266 }, 00:06:43.266 { 00:06:43.266 "nbd_device": "/dev/nbd15", 00:06:43.266 "bdev_name": "AIO0" 00:06:43.266 } 00:06:43.266 ]' 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:43.266 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:43.524 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:43.524 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:43.524 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:43.524 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:43.524 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:43.524 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:43.524 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:43.524 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:43.524 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:43.524 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:43.782 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:43.782 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:43.782 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:43.782 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:43.782 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:43.782 04:07:31 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:43.782 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:43.782 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:43.782 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:43.782 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:06:44.040 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:06:44.040 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:06:44.041 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:06:44.041 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:44.041 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:44.041 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:06:44.041 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:44.041 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:44.041 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:44.041 04:07:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:06:44.299 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:06:44.299 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:06:44.299 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:06:44.299 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:44.299 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:44.299 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:06:44.299 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:44.299 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:44.299 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:44.299 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:06:44.557 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:06:44.557 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:06:44.557 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:06:44.557 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:44.557 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:44.557 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:06:44.557 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:44.557 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:44.557 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:44.557 04:07:32 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:06:44.815 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:06:44.815 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:06:44.815 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:06:44.815 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:44.815 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:44.815 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:06:44.815 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:44.815 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:44.815 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:44.815 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:06:45.073 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:06:45.073 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:06:45.073 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:06:45.073 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:45.073 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:45.073 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:06:45.073 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:45.073 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:45.073 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:45.073 04:07:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:06:45.331 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:06:45.331 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:06:45.331 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:06:45.331 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:45.331 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:45.331 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:06:45.331 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:45.331 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:45.331 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:45.331 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:06:45.593 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:06:45.593 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:06:45.593 04:07:33 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:06:45.593 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:45.593 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:45.593 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:06:45.593 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:45.593 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:45.593 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:45.593 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:06:45.852 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:06:45.852 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:06:45.852 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:06:45.852 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:45.852 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:45.852 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:06:45.852 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:45.852 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:45.852 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:45.852 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:06:46.109 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:06:46.109 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:06:46.109 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:06:46.109 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:46.109 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:46.109 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:06:46.109 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:46.109 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:46.109 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:46.109 04:07:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:06:46.366 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:06:46.366 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:06:46.366 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:06:46.366 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:46.366 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:46.366 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:06:46.367 
04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:46.367 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:46.367 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:46.367 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:06:46.624 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:06:46.624 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:06:46.624 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:06:46.624 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:46.624 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:46.624 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:06:46.624 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:46.624 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:46.624 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:46.624 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:06:46.882 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:06:46.882 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:06:46.882 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:06:46.882 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:46.882 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:46.882 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:06:46.882 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:46.882 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:46.882 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:46.882 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:06:47.140 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:06:47.140 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:06:47.140 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:06:47.140 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:47.140 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:47.140 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:06:47.140 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:47.140 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:47.140 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:47.140 04:07:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:47.397 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:06:47.654 04:07:35 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:47.655 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:47.913 /dev/nbd0 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:47.913 1+0 records in 00:06:47.913 1+0 records out 00:06:47.913 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000196879 s, 20.8 MB/s 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:47.913 04:07:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:06:48.169 /dev/nbd1 00:06:48.169 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:48.169 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:48.169 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:06:48.169 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:48.169 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:48.170 1+0 records in 00:06:48.170 1+0 records out 00:06:48.170 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000211248 s, 19.4 MB/s 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:48.170 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:06:48.426 /dev/nbd10 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 
)) 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:48.426 1+0 records in 00:06:48.426 1+0 records out 00:06:48.426 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224352 s, 18.3 MB/s 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:48.426 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:06:48.684 /dev/nbd11 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:48.684 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:48.684 1+0 records in 00:06:48.684 1+0 records out 00:06:48.684 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000261792 s, 15.6 MB/s 00:06:48.942 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:48.942 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:48.942 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:48.942 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:48.942 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:48.942 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:48.942 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:48.942 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:06:49.200 /dev/nbd12 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd12 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd12 /proc/partitions 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:49.200 1+0 records in 00:06:49.200 1+0 records out 00:06:49.200 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299401 s, 13.7 MB/s 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:49.200 04:07:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:06:49.459 /dev/nbd13 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd13 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:49.459 04:07:37 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd13 /proc/partitions 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:49.459 1+0 records in 00:06:49.459 1+0 records out 00:06:49.459 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00028791 s, 14.2 MB/s 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:49.459 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:06:49.718 /dev/nbd14 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd14 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd14 /proc/partitions 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:49.718 1+0 records in 00:06:49.718 1+0 records out 00:06:49.718 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280554 s, 14.6 MB/s 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:49.718 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:06:49.976 /dev/nbd15 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd15 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd15 /proc/partitions 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:49.976 1+0 records in 00:06:49.976 1+0 records out 00:06:49.976 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00037744 s, 10.9 MB/s 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:49.976 04:07:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:06:50.234 /dev/nbd2 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:50.234 04:07:38 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:50.234 1+0 records in 00:06:50.234 1+0 records out 00:06:50.234 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000378785 s, 10.8 MB/s 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:50.234 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:06:50.493 /dev/nbd3 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:50.493 1+0 records in 00:06:50.493 1+0 records out 00:06:50.493 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306981 s, 13.3 MB/s 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:50.493 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:06:50.751 /dev/nbd4 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd4 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd4 /proc/partitions 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:50.751 1+0 records in 00:06:50.751 1+0 records out 00:06:50.751 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000413949 s, 9.9 MB/s 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:50.751 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:06:51.010 /dev/nbd5 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd5 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:51.010 04:07:38 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd5 /proc/partitions 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:51.010 1+0 records in 00:06:51.010 1+0 records out 00:06:51.010 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000387139 s, 10.6 MB/s 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:51.010 04:07:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:06:51.268 /dev/nbd6 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd6 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd6 /proc/partitions 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:51.268 1+0 records in 00:06:51.268 1+0 records out 00:06:51.268 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000479003 s, 8.6 MB/s 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:51.268 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:06:51.526 /dev/nbd7 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd7 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd7 /proc/partitions 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:51.526 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:51.526 1+0 records in 00:06:51.526 1+0 records out 00:06:51.526 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000413786 s, 9.9 MB/s 00:06:51.527 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:51.527 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:51.527 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:51.527 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:51.527 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:51.527 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:51.527 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:51.527 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:06:51.785 /dev/nbd8 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd8 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:51.785 04:07:39 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd8 /proc/partitions 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:51.785 1+0 records in 00:06:51.785 1+0 records out 00:06:51.785 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000561699 s, 7.3 MB/s 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:51.785 04:07:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:06:52.043 /dev/nbd9 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd9 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd9 /proc/partitions 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:52.043 1+0 records in 00:06:52.043 1+0 records out 00:06:52.043 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000518003 s, 7.9 MB/s 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
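Every export started above goes through the same two-step readiness check before the test touches it: the script greps /proc/partitions until the /dev/nbdX node shows up, then proves the device is actually readable with a single 4 KiB O_DIRECT read through dd, confirming the size with stat (the repeated "1+0 records in / 4096 bytes" lines). A sketch of nbd_start_disks and waitfornbd as they appear in this trace, with the loop bodies reconstructed from the -x output (the sleeps between retries, the failure return, and the $rootdir shorthand for the workspace path are assumptions, not the real autotest_common.sh source):

nbd_start_disks() {
  local rpc_server=$1
  local bdev_list=($2)   # Malloc0 Malloc1p0 ... AIO0 (16 bdevs in this run)
  local nbd_list=($3)    # /dev/nbd0 /dev/nbd1 ... /dev/nbd9
  local i

  for ((i = 0; i < ${#nbd_list[@]}; i++)); do
    # Export bdev i on its /dev/nbdX node over the nbd RPC socket.
    $rootdir/scripts/rpc.py -s "$rpc_server" nbd_start_disk "${bdev_list[i]}" "${nbd_list[i]}"
    waitfornbd "$(basename "${nbd_list[i]}")"
  done
}

waitfornbd() {
  local nbd_name=$1
  local i

  # Step 1: wait for the device node to be registered.
  for ((i = 1; i <= 20; i++)); do
    if grep -q -w "$nbd_name" /proc/partitions; then
      break
    fi
    sleep 0.1   # assumed; the trace only shows the first, successful pass
  done

  # Step 2: one 4 KiB direct read must come back non-empty.
  for ((i = 1; i <= 20; i++)); do
    dd if=/dev/$nbd_name of=$rootdir/test/bdev/nbdtest bs=4096 count=1 iflag=direct
    size=$(stat -c %s $rootdir/test/bdev/nbdtest)
    rm -f $rootdir/test/bdev/nbdtest
    if [ "$size" != "0" ]; then
      return 0
    fi
    sleep 0.1   # assumed, as above
  done

  return 1   # assumed failure path; never reached in this run
}

In this run the wrapper is invoked as nbd_start_disks /var/tmp/spdk-nbd.sock with the 16 bdev names and the 16 /dev/nbd* paths listed at the start of the trace.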
00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:52.043 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:52.301 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd0", 00:06:52.301 "bdev_name": "Malloc0" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd1", 00:06:52.301 "bdev_name": "Malloc1p0" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd10", 00:06:52.301 "bdev_name": "Malloc1p1" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd11", 00:06:52.301 "bdev_name": "Malloc2p0" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd12", 00:06:52.301 "bdev_name": "Malloc2p1" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd13", 00:06:52.301 "bdev_name": "Malloc2p2" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd14", 00:06:52.301 "bdev_name": "Malloc2p3" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd15", 00:06:52.301 "bdev_name": "Malloc2p4" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd2", 00:06:52.301 "bdev_name": "Malloc2p5" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd3", 00:06:52.301 "bdev_name": "Malloc2p6" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd4", 00:06:52.301 "bdev_name": "Malloc2p7" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd5", 00:06:52.301 "bdev_name": "TestPT" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd6", 00:06:52.301 "bdev_name": "raid0" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd7", 00:06:52.301 "bdev_name": "concat0" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd8", 00:06:52.301 "bdev_name": "raid1" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd9", 00:06:52.301 "bdev_name": "AIO0" 00:06:52.301 } 00:06:52.301 ]' 00:06:52.301 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd0", 00:06:52.301 "bdev_name": "Malloc0" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd1", 00:06:52.301 "bdev_name": "Malloc1p0" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd10", 00:06:52.301 "bdev_name": "Malloc1p1" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd11", 00:06:52.301 "bdev_name": "Malloc2p0" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd12", 00:06:52.301 "bdev_name": "Malloc2p1" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd13", 00:06:52.301 "bdev_name": "Malloc2p2" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd14", 00:06:52.301 "bdev_name": "Malloc2p3" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd15", 
00:06:52.301 "bdev_name": "Malloc2p4" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd2", 00:06:52.301 "bdev_name": "Malloc2p5" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd3", 00:06:52.301 "bdev_name": "Malloc2p6" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd4", 00:06:52.301 "bdev_name": "Malloc2p7" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd5", 00:06:52.301 "bdev_name": "TestPT" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd6", 00:06:52.301 "bdev_name": "raid0" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd7", 00:06:52.301 "bdev_name": "concat0" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd8", 00:06:52.301 "bdev_name": "raid1" 00:06:52.301 }, 00:06:52.301 { 00:06:52.301 "nbd_device": "/dev/nbd9", 00:06:52.301 "bdev_name": "AIO0" 00:06:52.301 } 00:06:52.301 ]' 00:06:52.301 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:52.560 /dev/nbd1 00:06:52.560 /dev/nbd10 00:06:52.560 /dev/nbd11 00:06:52.560 /dev/nbd12 00:06:52.560 /dev/nbd13 00:06:52.560 /dev/nbd14 00:06:52.560 /dev/nbd15 00:06:52.560 /dev/nbd2 00:06:52.560 /dev/nbd3 00:06:52.560 /dev/nbd4 00:06:52.560 /dev/nbd5 00:06:52.560 /dev/nbd6 00:06:52.560 /dev/nbd7 00:06:52.560 /dev/nbd8 00:06:52.560 /dev/nbd9' 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:52.560 /dev/nbd1 00:06:52.560 /dev/nbd10 00:06:52.560 /dev/nbd11 00:06:52.560 /dev/nbd12 00:06:52.560 /dev/nbd13 00:06:52.560 /dev/nbd14 00:06:52.560 /dev/nbd15 00:06:52.560 /dev/nbd2 00:06:52.560 /dev/nbd3 00:06:52.560 /dev/nbd4 00:06:52.560 /dev/nbd5 00:06:52.560 /dev/nbd6 00:06:52.560 /dev/nbd7 00:06:52.560 /dev/nbd8 00:06:52.560 /dev/nbd9' 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
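Just before the data pass, the script double-checks the target's own view of the exports: nbd_get_disks returns a JSON array pairing each /dev/nbdX with its bdev name, jq reduces it to the device paths, and grep -c counts them (16 here, against the 0 reported earlier when nothing was exported). A sketch of that counting helper, reconstructed from the trace (the '|| true' is inferred from the bare 'true' step visible in the earlier empty-list run, where grep -c exits non-zero; $rootdir again stands in for the workspace path):

nbd_get_count() {
  local rpc_server=$1

  # nbd_get_disks returns a JSON array of {nbd_device, bdev_name} pairs.
  local nbd_disks_json
  nbd_disks_json=$($rootdir/scripts/rpc.py -s "$rpc_server" nbd_get_disks)

  # Keep only the device paths and count them; '|| true' keeps the command
  # from failing when grep -c finds nothing, which is what yields count=0
  # before any disks are started.
  local nbd_disks_name
  nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
  local count
  count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)

  echo "$count"
}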
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:06:52.560 256+0 records in 00:06:52.560 256+0 records out 00:06:52.560 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00402526 s, 260 MB/s 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:52.560 256+0 records in 00:06:52.560 256+0 records out 00:06:52.560 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.124314 s, 8.4 MB/s 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:52.560 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:52.817 256+0 records in 00:06:52.817 256+0 records out 00:06:52.818 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.11992 s, 8.7 MB/s 00:06:52.818 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:52.818 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:06:52.818 256+0 records in 00:06:52.818 256+0 records out 00:06:52.818 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.117562 s, 8.9 MB/s 00:06:52.818 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:52.818 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:06:53.075 256+0 records in 00:06:53.075 256+0 records out 00:06:53.075 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.119321 s, 8.8 MB/s 00:06:53.075 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.075 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:06:53.075 256+0 records in 00:06:53.075 256+0 records out 00:06:53.075 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.121408 s, 8.6 MB/s 00:06:53.075 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.075 04:07:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:06:53.075 256+0 records in 00:06:53.075 256+0 records out 00:06:53.075 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.116751 s, 9.0 MB/s 00:06:53.075 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.075 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:06:53.338 256+0 records in 00:06:53.338 256+0 records out 00:06:53.338 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.117757 s, 8.9 MB/s 00:06:53.338 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.338 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 
of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:06:53.338 256+0 records in 00:06:53.338 256+0 records out 00:06:53.338 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.120032 s, 8.7 MB/s 00:06:53.338 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.338 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:06:53.596 256+0 records in 00:06:53.596 256+0 records out 00:06:53.596 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.118285 s, 8.9 MB/s 00:06:53.596 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.596 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:06:53.596 256+0 records in 00:06:53.596 256+0 records out 00:06:53.596 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.118906 s, 8.8 MB/s 00:06:53.596 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.596 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:06:53.853 256+0 records in 00:06:53.853 256+0 records out 00:06:53.853 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.118865 s, 8.8 MB/s 00:06:53.853 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.853 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:06:53.853 256+0 records in 00:06:53.853 256+0 records out 00:06:53.853 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.118725 s, 8.8 MB/s 00:06:53.853 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:53.853 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:06:54.111 256+0 records in 00:06:54.111 256+0 records out 00:06:54.111 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.126248 s, 8.3 MB/s 00:06:54.111 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:54.111 04:07:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:06:54.111 256+0 records in 00:06:54.111 256+0 records out 00:06:54.111 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.120908 s, 8.7 MB/s 00:06:54.111 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:54.111 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:06:54.368 256+0 records in 00:06:54.368 256+0 records out 00:06:54.368 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.129699 s, 8.1 MB/s 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:06:54.368 
256+0 records in 00:06:54.368 256+0 records out 00:06:54.368 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.118827 s, 8.8 MB/s 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 
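The data pass that produced the throughput lines above writes one 1 MiB pattern and then reads it back from every export: dd pulls 256 x 4 KiB blocks from /dev/urandom into nbdrandtest, copies the file onto each /dev/nbdX with oflag=direct, and the verify pass now running compares the first 1 MiB of each device against the file with cmp -b before deleting it. A sketch of that helper, reconstructed from the trace (argument handling and the exact branch structure are assumptions):

nbd_dd_data_verify() {
  local nbd_list=($1)
  local operation=$2
  local tmp_file=$rootdir/test/bdev/nbdrandtest
  local i

  if [ "$operation" = "write" ]; then
    # 1 MiB of random data (256 x 4 KiB), pushed to every export with O_DIRECT.
    dd if=/dev/urandom of=$tmp_file bs=4096 count=256
    for i in "${nbd_list[@]}"; do
      dd if=$tmp_file of=$i bs=4096 count=256 oflag=direct
    done
  elif [ "$operation" = "verify" ]; then
    # Read the first 1 MiB back from each export and compare byte-for-byte.
    for i in "${nbd_list[@]}"; do
      cmp -b -n 1M $tmp_file $i
    done
    rm $tmp_file
  fi
}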
00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.368 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:54.626 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:54.889 04:07:42 
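The dd/cmp sequence traced above is nbd_common.sh's write-then-verify pattern: a single 1 MiB random pattern file (nbdrandtest) is written to every exported /dev/nbdX with O_DIRECT, then compared back byte-for-byte before the temporary file is removed. A minimal standalone sketch of the same idea follows; the device list and pattern path are illustrative placeholders, not the exact helper from the test suite.

# Sketch only: NBD_DEVS and PATTERN are placeholders chosen for illustration.
PATTERN=/tmp/nbdrandtest
NBD_DEVS=(/dev/nbd0 /dev/nbd1)

# One 1 MiB random pattern, reused for every device.
dd if=/dev/urandom of="$PATTERN" bs=4096 count=256

# Write phase: push the pattern through each NBD device, bypassing the page cache.
for dev in "${NBD_DEVS[@]}"; do
    dd if="$PATTERN" of="$dev" bs=4096 count=256 oflag=direct
done

# Verify phase: the first 1 MiB of each device must match the pattern exactly.
for dev in "${NBD_DEVS[@]}"; do
    cmp -b -n 1M "$PATTERN" "$dev" || echo "data mismatch on $dev" >&2
done

rm -f "$PATTERN"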
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:54.889 04:07:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:06:55.196 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:06:55.196 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:06:55.196 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:06:55.196 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:55.196 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:55.196 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:06:55.196 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:55.196 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:55.196 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:55.196 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:06:55.477 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:06:55.477 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:06:55.477 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:06:55.477 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:55.477 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:55.477 04:07:43 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:06:55.477 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:55.477 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:55.477 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:55.477 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:06:55.733 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:06:55.733 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:06:55.733 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:06:55.733 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:55.733 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:55.733 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:06:55.990 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:55.990 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:55.990 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:55.990 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:06:55.990 04:07:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:06:55.990 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:06:55.990 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:06:55.990 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:55.990 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:55.990 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:06:56.247 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:56.247 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:56.247 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:56.247 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:06:56.504 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:06:56.504 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:06:56.504 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:06:56.504 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:56.504 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:56.504 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:06:56.504 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:56.504 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:56.504 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:56.504 
04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:06:56.761 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:06:56.761 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:06:56.761 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:06:56.761 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:56.761 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:56.761 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:06:56.761 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:56.761 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:56.761 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:56.762 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:06:57.018 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:06:57.018 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:06:57.018 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:06:57.018 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:57.018 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:57.018 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:06:57.018 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:57.018 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:57.018 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:57.018 04:07:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:06:57.274 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:06:57.274 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:06:57.274 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:06:57.274 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:57.274 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:57.274 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:06:57.274 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:57.274 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:57.275 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:57.275 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:06:57.532 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:06:57.532 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:06:57.532 
04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:06:57.532 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:57.532 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:57.532 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:06:57.532 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:57.532 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:57.532 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:57.532 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:06:57.789 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:06:57.789 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:06:57.789 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:06:57.789 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:57.789 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:57.789 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:06:57.789 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:57.789 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:57.789 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:57.789 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:06:58.064 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:06:58.064 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:06:58.064 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:06:58.064 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:58.064 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:58.064 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:06:58.064 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:58.064 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:58.064 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:58.064 04:07:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:06:58.321 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:06:58.321 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:06:58.321 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:06:58.321 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:58.321 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:58.321 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:06:58.321 
04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:58.321 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:58.321 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:58.321 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:06:58.579 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:06:58.579 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:06:58.579 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:06:58.579 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:58.579 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:58.579 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:06:58.579 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:58.579 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:58.579 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:58.579 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:58.836 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@104 -- # count=0 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:06:59.095 04:07:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:06:59.353 malloc_lvol_verify 00:06:59.353 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:06:59.611 cd70bc89-a4cf-450b-b486-911bfbd5c44a 00:06:59.611 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:06:59.869 766cfe13-950f-45d9-9d2d-1ce69808ec63 00:06:59.869 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:07:00.127 /dev/nbd0 00:07:00.127 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:07:00.127 mke2fs 1.46.5 (30-Dec-2021) 00:07:00.127 Discarding device blocks: 0/4096 done 00:07:00.127 Creating filesystem with 4096 1k blocks and 1024 inodes 00:07:00.127 00:07:00.127 Allocating group tables: 0/1 done 00:07:00.127 Writing inode tables: 0/1 done 00:07:00.127 Creating journal (1024 blocks): done 00:07:00.127 Writing superblocks and filesystem accounting information: 0/1 done 00:07:00.127 00:07:00.127 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:07:00.127 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:07:00.127 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:00.127 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:07:00.127 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:00.127 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:00.127 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:00.127 04:07:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:00.384 04:07:48 
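The nbd_with_lvol_verify step traced above layers a logical volume on a malloc bdev, exports it over NBD, and confirms that mkfs.ext4 succeeds on the resulting block device. A rough sketch of that sequence, assuming an SPDK app is already serving RPCs on /var/tmp/spdk-nbd.sock (as in this run); the rpc.py path below is a placeholder for the local checkout.

# rpc() wraps scripts/rpc.py against the nbd RPC socket used in this run.
rpc() { /path/to/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock "$@"; }

rpc bdev_malloc_create -b malloc_lvol_verify 16 512   # 16 MiB malloc bdev, 512 B blocks
rpc bdev_lvol_create_lvstore malloc_lvol_verify lvs   # lvstore "lvs" on top of it
rpc bdev_lvol_create lvol 4 -l lvs                    # 4 MiB lvol "lvol" in that lvstore

rpc nbd_start_disk lvs/lvol /dev/nbd0                 # export the lvol as /dev/nbd0
mkfs.ext4 /dev/nbd0                                   # must succeed for the test to pass
rpc nbd_stop_disk /dev/nbd0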
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3808644 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 3808644 ']' 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 3808644 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3808644 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3808644' 00:07:00.384 killing process with pid 3808644 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@965 -- # kill 3808644 00:07:00.384 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@970 -- # wait 3808644 00:07:00.642 04:07:48 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:07:00.642 00:07:00.642 real 0m23.332s 00:07:00.642 user 0m30.465s 00:07:00.642 sys 0m11.963s 00:07:00.642 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:00.642 04:07:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:00.642 ************************************ 00:07:00.642 END TEST bdev_nbd 00:07:00.642 ************************************ 00:07:00.642 04:07:48 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:07:00.642 04:07:48 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:07:00.642 04:07:48 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:07:00.642 04:07:48 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:07:00.642 04:07:48 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:07:00.642 04:07:48 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:00.642 04:07:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:00.899 ************************************ 00:07:00.899 START TEST bdev_fio 00:07:00.899 ************************************ 00:07:00.899 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:07:00.899 04:07:48 
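Every nbd_stop_disk call in the teardown above is followed by a waitfornbd_exit loop that polls /proc/partitions (up to 20 attempts) until the kernel has actually released the device. A simplified version of that polling idiom is sketched below; the sleep interval is an illustrative choice, not the value used by nbd_common.sh.

# Poll /proc/partitions until the named nbd device disappears.
waitfornbd_exit_sketch() {
    local nbd_name=$1 i
    for ((i = 1; i <= 20; i++)); do
        if ! grep -q -w "$nbd_name" /proc/partitions; then
            return 0                     # device is gone, teardown finished
        fi
        sleep 0.1                        # illustrative back-off
    done
    echo "timed out waiting for $nbd_name to go away" >&2
    return 1
}

waitfornbd_exit_sketch nbd0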
blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:07:00.899 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:00.899 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:00.899 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:07:00.899 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:07:00.899 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:07:00.899 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 
blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:07:00.900 04:07:48 
blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:00.900 04:07:48 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:00.900 ************************************ 00:07:00.900 START TEST bdev_fio_rw_verify 00:07:00.900 ************************************ 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:07:00.900 04:07:48 
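The loop traced above appends one [job_<bdev>] section per bdev to bdev.fio, and the fio_bdev wrapper then runs fio with the SPDK bdev ioengine by LD_PRELOADing build/fio/spdk_bdev and pointing --spdk_json_conf at the generated bdev configuration. A condensed sketch of that flow; the bdev list here is only a subset of the sixteen jobs above, and it assumes bdev.fio already carries the [global] section written by fio_config_gen.

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk    # workspace path from this run
FIO_JOB=$SPDK/test/bdev/bdev.fio

# One job section per bdev under test (subset of the names shown above).
for b in Malloc0 Malloc1p0 TestPT raid0 concat0 raid1 AIO0; do
    echo "[job_$b]"    >> "$FIO_JOB"
    echo "filename=$b" >> "$FIO_JOB"
done

# Drive the bdevs directly through the SPDK fio plugin (no kernel block devices).
LD_PRELOAD=$SPDK/build/fio/spdk_bdev /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    "$FIO_JOB" --verify_state_save=0 \
    --spdk_json_conf=$SPDK/test/bdev/bdev.json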
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:00.900 04:07:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:01.159 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 
job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:01.159 fio-3.35 00:07:01.159 Starting 16 threads 00:07:13.357 00:07:13.357 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=3812213: Wed May 15 04:07:59 2024 00:07:13.357 read: IOPS=104k, BW=405MiB/s (425MB/s)(4049MiB/10001msec) 00:07:13.357 slat (usec): min=2, max=1170, avg=29.34, stdev=10.80 00:07:13.357 clat (usec): min=10, max=1508, avg=249.64, stdev=106.90 00:07:13.357 lat (usec): min=20, max=1543, avg=278.98, stdev=111.60 00:07:13.357 clat percentiles (usec): 00:07:13.357 | 50.000th=[ 247], 99.000th=[ 465], 99.900th=[ 578], 99.990th=[ 717], 00:07:13.357 | 99.999th=[ 979] 00:07:13.357 write: IOPS=163k, BW=638MiB/s (669MB/s)(6298MiB/9872msec); 0 zone resets 00:07:13.357 slat (usec): min=5, max=3386, avg=42.16, stdev=11.45 00:07:13.357 clat (usec): min=11, max=3751, avg=297.76, stdev=126.87 00:07:13.357 lat (usec): min=31, max=3796, avg=339.92, stdev=131.63 00:07:13.357 clat percentiles (usec): 00:07:13.357 | 50.000th=[ 289], 99.000th=[ 611], 99.900th=[ 758], 99.990th=[ 865], 00:07:13.357 | 99.999th=[ 1029] 00:07:13.357 bw ( KiB/s): min=551704, max=810943, per=98.75%, avg=645123.58, stdev=4273.40, samples=304 00:07:13.357 iops : min=137926, max=202732, avg=161280.74, stdev=1068.33, samples=304 00:07:13.357 lat (usec) : 20=0.01%, 50=0.58%, 100=5.37%, 250=38.55%, 500=52.42% 00:07:13.357 lat (usec) : 750=2.98%, 1000=0.08% 00:07:13.357 lat (msec) : 2=0.01%, 4=0.01% 00:07:13.357 cpu : usr=98.98%, sys=0.49%, ctx=688, majf=0, minf=2720 00:07:13.357 IO depths : 1=12.4%, 2=24.7%, 4=50.3%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:07:13.357 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:13.357 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:13.357 issued rwts: total=1036565,1612237,0,0 short=0,0,0,0 dropped=0,0,0,0 00:07:13.357 latency : target=0, window=0, percentile=100.00%, depth=8 00:07:13.357 00:07:13.357 Run status group 0 (all jobs): 00:07:13.357 READ: bw=405MiB/s (425MB/s), 405MiB/s-405MiB/s (425MB/s-425MB/s), io=4049MiB (4246MB), run=10001-10001msec 00:07:13.357 WRITE: bw=638MiB/s (669MB/s), 638MiB/s-638MiB/s (669MB/s-669MB/s), io=6298MiB (6604MB), run=9872-9872msec 00:07:13.357 00:07:13.357 real 0m11.528s 00:07:13.357 user 2m41.053s 00:07:13.357 sys 0m1.415s 00:07:13.357 04:08:00 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:13.357 04:08:00 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:07:13.357 ************************************ 00:07:13.357 END TEST bdev_fio_rw_verify 00:07:13.357 ************************************ 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- 
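The summary above is internally consistent, which is a quick sanity check worth doing when reading these runs: 1,036,565 reads of 4 KiB come to about 4246 MB (4049 MiB) in 10.001 s, roughly 104k IOPS at 405 MiB/s, and 1,612,237 writes of 4 KiB come to about 6604 MB (6298 MiB) in 9.872 s, roughly 163k IOPS at 638 MiB/s. The same cross-check in shell, with the figures taken from the log above:

# Cross-check the fio summary: issued IOs * 4096 B should reproduce the reported totals.
echo $(( 1036565 * 4096 / 1000000 ))   # ~4245 MB read     (log reports 4246 MB / 4049 MiB)
echo $(( 1612237 * 4096 / 1000000 ))   # ~6603 MB written  (log reports 6604 MB / 6298 MiB)
echo $(( 1036565 * 1000 / 10001 ))     # ~104k read IOPS over 10.001 s (log: IOPS=104k)
echo $(( 1612237 * 1000 / 9872 ))      # ~163k write IOPS over 9.872 s (log: IOPS=163k)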
bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:07:13.357 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:07:13.358 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "b56347e6-134d-4bbb-95f6-c7f7b6df45d6"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b56347e6-134d-4bbb-95f6-c7f7b6df45d6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "2dcc76db-36fd-5b20-bed9-abc082d8df21"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "2dcc76db-36fd-5b20-bed9-abc082d8df21",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": 
false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "20ff41aa-41e7-5a1c-b479-82564ee83fa2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "20ff41aa-41e7-5a1c-b479-82564ee83fa2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "608c2ea9-0f8f-5b45-ba8e-98352992ad99"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "608c2ea9-0f8f-5b45-ba8e-98352992ad99",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "73876cfc-f9ee-5012-922c-48fc719deaa0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "73876cfc-f9ee-5012-922c-48fc719deaa0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "84b891af-26e3-565f-ac29-50ceee22b12f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "84b891af-26e3-565f-ac29-50ceee22b12f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "5ca11368-9f6e-539a-aa6a-ff35eec02037"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5ca11368-9f6e-539a-aa6a-ff35eec02037",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' 
"zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "c747978b-fd45-5874-9e07-ad8241c954e1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c747978b-fd45-5874-9e07-ad8241c954e1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "1d666b01-c929-5dfe-999d-30a3e00c7a7a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1d666b01-c929-5dfe-999d-30a3e00c7a7a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "30ee2496-49d8-5b22-a283-c51bc6755f3a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "30ee2496-49d8-5b22-a283-c51bc6755f3a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "f632768c-1cca-547e-bcc0-b37b5e9cb1c5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f632768c-1cca-547e-bcc0-b37b5e9cb1c5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "735f4fb4-44c7-566c-9de4-c5a40e5c79c5"' ' ],' ' "product_name": 
"passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "735f4fb4-44c7-566c-9de4-c5a40e5c79c5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "13ffda48-aefe-46b3-87f0-810b433ecc0a"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "13ffda48-aefe-46b3-87f0-810b433ecc0a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "13ffda48-aefe-46b3-87f0-810b433ecc0a",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "916715ef-e5e6-49fc-bae4-da39ea36f454",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "3504f656-ff27-4692-9f43-42e77fdd1998",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "c0ab1585-622f-47a6-b356-eacf0a81ff84"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "c0ab1585-622f-47a6-b356-eacf0a81ff84",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c0ab1585-622f-47a6-b356-eacf0a81ff84",' ' "strip_size_kb": 64,' ' "state": "online",' ' 
"raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "a14bb9d4-085b-4d18-add2-a8f2cef9d25e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "dec0015f-a1af-4b8f-9a2a-62fc58c0abd1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "68628006-718a-494c-a1ef-c0c7e9bd1a8f"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "68628006-718a-494c-a1ef-c0c7e9bd1a8f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "68628006-718a-494c-a1ef-c0c7e9bd1a8f",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "311a4465-bc7a-4fea-aa4a-77aa49fd36d6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e40748dc-5932-4de0-8806-ae3983b8441c",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "f47942cf-9af6-4096-aa3f-a4afd351e1fc"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "f47942cf-9af6-4096-aa3f-a4afd351e1fc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:13.358 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:07:13.358 Malloc1p0 00:07:13.358 Malloc1p1 00:07:13.358 Malloc2p0 00:07:13.358 Malloc2p1 00:07:13.358 Malloc2p2 00:07:13.358 Malloc2p3 00:07:13.358 Malloc2p4 00:07:13.358 Malloc2p5 00:07:13.358 Malloc2p6 00:07:13.358 Malloc2p7 00:07:13.358 TestPT 00:07:13.358 raid0 00:07:13.358 concat0 ]] 00:07:13.359 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- 
# printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "b56347e6-134d-4bbb-95f6-c7f7b6df45d6"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b56347e6-134d-4bbb-95f6-c7f7b6df45d6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "2dcc76db-36fd-5b20-bed9-abc082d8df21"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "2dcc76db-36fd-5b20-bed9-abc082d8df21",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "20ff41aa-41e7-5a1c-b479-82564ee83fa2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "20ff41aa-41e7-5a1c-b479-82564ee83fa2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "608c2ea9-0f8f-5b45-ba8e-98352992ad99"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "608c2ea9-0f8f-5b45-ba8e-98352992ad99",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "73876cfc-f9ee-5012-922c-48fc719deaa0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "73876cfc-f9ee-5012-922c-48fc719deaa0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' 
"zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "84b891af-26e3-565f-ac29-50ceee22b12f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "84b891af-26e3-565f-ac29-50ceee22b12f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "5ca11368-9f6e-539a-aa6a-ff35eec02037"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "5ca11368-9f6e-539a-aa6a-ff35eec02037",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "c747978b-fd45-5874-9e07-ad8241c954e1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "c747978b-fd45-5874-9e07-ad8241c954e1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "1d666b01-c929-5dfe-999d-30a3e00c7a7a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1d666b01-c929-5dfe-999d-30a3e00c7a7a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "30ee2496-49d8-5b22-a283-c51bc6755f3a"' ' ],' ' "product_name": "Split 
Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "30ee2496-49d8-5b22-a283-c51bc6755f3a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "f632768c-1cca-547e-bcc0-b37b5e9cb1c5"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f632768c-1cca-547e-bcc0-b37b5e9cb1c5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "735f4fb4-44c7-566c-9de4-c5a40e5c79c5"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "735f4fb4-44c7-566c-9de4-c5a40e5c79c5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "13ffda48-aefe-46b3-87f0-810b433ecc0a"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "13ffda48-aefe-46b3-87f0-810b433ecc0a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "13ffda48-aefe-46b3-87f0-810b433ecc0a",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' 
"num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "916715ef-e5e6-49fc-bae4-da39ea36f454",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "3504f656-ff27-4692-9f43-42e77fdd1998",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "c0ab1585-622f-47a6-b356-eacf0a81ff84"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "c0ab1585-622f-47a6-b356-eacf0a81ff84",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "c0ab1585-622f-47a6-b356-eacf0a81ff84",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "a14bb9d4-085b-4d18-add2-a8f2cef9d25e",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "dec0015f-a1af-4b8f-9a2a-62fc58c0abd1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "68628006-718a-494c-a1ef-c0c7e9bd1a8f"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "68628006-718a-494c-a1ef-c0c7e9bd1a8f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "68628006-718a-494c-a1ef-c0c7e9bd1a8f",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "311a4465-bc7a-4fea-aa4a-77aa49fd36d6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e40748dc-5932-4de0-8806-ae3983b8441c",' ' "is_configured": 
true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "f47942cf-9af6-4096-aa3f-a4afd351e1fc"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "f47942cf-9af6-4096-aa3f-a4afd351e1fc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- 
# echo filename=Malloc2p3 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:13.360 04:08:00 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:13.360 ************************************ 00:07:13.360 START TEST bdev_fio_trim 00:07:13.360 ************************************ 00:07:13.360 04:08:00 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:13.360 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:07:13.361 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:07:13.361 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:07:13.361 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:07:13.361 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:13.361 04:08:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:13.361 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:13.361 fio-3.35 00:07:13.361 Starting 14 threads 00:07:25.557 00:07:25.557 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=3813658: Wed May 15 04:08:11 2024 00:07:25.557 write: IOPS=151k, BW=590MiB/s (618MB/s)(5897MiB/10001msec); 0 zone resets 00:07:25.557 slat (usec): min=2, max=915, avg=32.33, stdev= 7.56 00:07:25.557 clat (usec): min=28, max=3843, avg=231.65, stdev=75.40 00:07:25.557 lat (usec): min=39, max=3878, avg=263.98, stdev=77.56 00:07:25.557 clat percentiles (usec): 00:07:25.557 | 50.000th=[ 225], 99.000th=[ 379], 99.900th=[ 457], 99.990th=[ 519], 00:07:25.557 | 99.999th=[ 832] 00:07:25.557 bw ( KiB/s): min=565792, max=786714, per=100.00%, avg=604848.95, stdev=4120.69, samples=266 00:07:25.557 iops : min=141448, max=196677, avg=151212.05, stdev=1030.14, samples=266 00:07:25.557 trim: IOPS=151k, BW=590MiB/s (618MB/s)(5897MiB/10001msec); 0 zone resets 00:07:25.557 slat (usec): min=4, max=121, avg=22.43, stdev= 5.37 00:07:25.557 clat (usec): min=4, max=3878, avg=260.27, stdev=81.18 00:07:25.557 lat (usec): min=15, max=3901, avg=282.70, stdev=83.49 00:07:25.557 clat percentiles (usec): 00:07:25.557 | 50.000th=[ 258], 99.000th=[ 416], 99.900th=[ 453], 99.990th=[ 510], 00:07:25.557 | 99.999th=[ 603] 00:07:25.557 bw ( KiB/s): min=565792, max=786714, per=100.00%, avg=604848.95, stdev=4120.75, samples=266 00:07:25.557 iops : min=141448, max=196677, 
avg=151212.16, stdev=1030.16, samples=266 00:07:25.557 lat (usec) : 10=0.02%, 20=0.04%, 50=0.14%, 100=2.00%, 250=51.07% 00:07:25.557 lat (usec) : 500=46.72%, 750=0.02%, 1000=0.01% 00:07:25.557 lat (msec) : 2=0.01%, 4=0.01% 00:07:25.557 cpu : usr=99.48%, sys=0.01%, ctx=575, majf=0, minf=1024 00:07:25.557 IO depths : 1=12.5%, 2=24.9%, 4=50.0%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:07:25.557 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:25.557 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:07:25.557 issued rwts: total=0,1509640,1509647,0 short=0,0,0,0 dropped=0,0,0,0 00:07:25.557 latency : target=0, window=0, percentile=100.00%, depth=8 00:07:25.557 00:07:25.557 Run status group 0 (all jobs): 00:07:25.557 WRITE: bw=590MiB/s (618MB/s), 590MiB/s-590MiB/s (618MB/s-618MB/s), io=5897MiB (6183MB), run=10001-10001msec 00:07:25.557 TRIM: bw=590MiB/s (618MB/s), 590MiB/s-590MiB/s (618MB/s-618MB/s), io=5897MiB (6184MB), run=10001-10001msec 00:07:25.557 00:07:25.557 real 0m11.364s 00:07:25.557 user 2m21.601s 00:07:25.557 sys 0m0.583s 00:07:25.557 04:08:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:25.557 04:08:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:07:25.557 ************************************ 00:07:25.557 END TEST bdev_fio_trim 00:07:25.557 ************************************ 00:07:25.557 04:08:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:07:25.557 04:08:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:25.557 04:08:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:07:25.557 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:25.557 04:08:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:07:25.557 00:07:25.557 real 0m23.166s 00:07:25.557 user 5m2.821s 00:07:25.557 sys 0m2.114s 00:07:25.557 04:08:11 blockdev_general.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:25.557 04:08:11 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:25.557 ************************************ 00:07:25.557 END TEST bdev_fio 00:07:25.557 ************************************ 00:07:25.557 04:08:11 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:07:25.557 04:08:11 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:07:25.557 04:08:11 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:07:25.557 04:08:11 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:25.557 04:08:11 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:25.557 ************************************ 00:07:25.557 START TEST bdev_verify 00:07:25.557 ************************************ 00:07:25.557 04:08:11 blockdev_general.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:07:25.557 [2024-05-15 04:08:11.934683] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
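The [job_<bdev>] sections consumed by fio in the trim stage above are generated by the loop traced at bdev/blockdev.sh lines 356-358: the bdev JSON is filtered with jq for bdevs whose supported_io_types.unmap is true, and a job section plus a filename line is emitted for each match. A minimal standalone sketch of that pattern follows; it assumes the bdev list comes from SPDK's bdev_get_bdevs RPC via scripts/rpc.py (how the bdevs array is actually populated is not visible in this excerpt), so treat that call as illustrative rather than as the script's own code.

# Sketch only: regenerate fio job sections for trim-capable bdevs.
# Assumes a running SPDK target and scripts/rpc.py from an SPDK checkout (not shown in this log).
bdevs_json=$(./scripts/rpc.py bdev_get_bdevs)   # JSON array of bdev objects
for b in $(echo "$bdevs_json" | jq -r '.[] | select(.supported_io_types.unmap == true) | .name'); do
    echo "[job_$b]"      # one fio job section per trim-capable bdev,
    echo "filename=$b"   # mirroring the echo '[job_...]' lines in the trace above
done >> bdev.fio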
00:07:25.557 [2024-05-15 04:08:11.934751] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3815009 ] 00:07:25.557 [2024-05-15 04:08:12.015951] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:25.557 [2024-05-15 04:08:12.140277] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:25.557 [2024-05-15 04:08:12.140282] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.557 [2024-05-15 04:08:12.312586] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:25.557 [2024-05-15 04:08:12.312675] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:25.557 [2024-05-15 04:08:12.312697] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:25.557 [2024-05-15 04:08:12.320583] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:25.557 [2024-05-15 04:08:12.320619] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:25.557 [2024-05-15 04:08:12.328593] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:25.557 [2024-05-15 04:08:12.328627] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:25.557 [2024-05-15 04:08:12.411985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:25.557 [2024-05-15 04:08:12.412051] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:25.557 [2024-05-15 04:08:12.412075] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1750fd0 00:07:25.557 [2024-05-15 04:08:12.412089] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:25.557 [2024-05-15 04:08:12.413786] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:25.557 [2024-05-15 04:08:12.413817] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:25.557 Running I/O for 5 seconds... 
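The TestPT bdev claimed above is a passthru vbdev stacked on Malloc3, as both the vbdev_passthru notices and the driver_specific block of the bdev JSON show. Written as a configuration entry of the kind carried in a file such as the bdev.json passed via --json, it would look roughly like the sketch below; the actual contents of that file are not part of this log, so the file name and surrounding structure here are assumptions.

# Sketch only: a bdev_passthru_create entry as it might appear in a JSON config fragment
# (the real bdev.json used by this run is not shown in the log).
cat > /tmp/passthru_snippet.json <<'EOF'
{
  "method": "bdev_passthru_create",
  "params": {
    "base_bdev_name": "Malloc3",
    "name": "TestPT"
  }
}
EOF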
00:07:30.818 00:07:30.818 Latency(us) 00:07:30.818 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:30.818 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x1000 00:07:30.818 Malloc0 : 5.06 1315.54 5.14 0.00 0.00 97105.24 458.15 340204.66 00:07:30.818 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x1000 length 0x1000 00:07:30.818 Malloc0 : 5.15 1316.94 5.14 0.00 0.00 96999.50 552.20 383701.14 00:07:30.818 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x800 00:07:30.818 Malloc1p0 : 5.21 688.16 2.69 0.00 0.00 185099.59 3301.07 194180.74 00:07:30.818 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x800 length 0x800 00:07:30.818 Malloc1p0 : 5.15 695.46 2.72 0.00 0.00 183166.01 3276.80 194957.46 00:07:30.818 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x800 00:07:30.818 Malloc1p1 : 5.21 687.89 2.69 0.00 0.00 184692.37 3325.35 191073.85 00:07:30.818 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x800 length 0x800 00:07:30.818 Malloc1p1 : 5.16 695.20 2.72 0.00 0.00 182733.38 3349.62 191850.57 00:07:30.818 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x200 00:07:30.818 Malloc2p0 : 5.21 687.62 2.69 0.00 0.00 184316.06 3301.07 184860.07 00:07:30.818 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x200 length 0x200 00:07:30.818 Malloc2p0 : 5.16 694.93 2.71 0.00 0.00 182361.90 3301.07 185636.79 00:07:30.818 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x200 00:07:30.818 Malloc2p1 : 5.21 687.35 2.68 0.00 0.00 183927.39 3325.35 179423.00 00:07:30.818 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x200 length 0x200 00:07:30.818 Malloc2p1 : 5.16 694.67 2.71 0.00 0.00 181967.92 3325.35 179423.00 00:07:30.818 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x200 00:07:30.818 Malloc2p2 : 5.22 687.08 2.68 0.00 0.00 183568.38 3301.07 172432.50 00:07:30.818 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x200 length 0x200 00:07:30.818 Malloc2p2 : 5.16 694.41 2.71 0.00 0.00 181591.68 3325.35 173209.22 00:07:30.818 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x200 00:07:30.818 Malloc2p3 : 5.22 686.80 2.68 0.00 0.00 183172.00 3252.53 169325.61 00:07:30.818 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x200 length 0x200 00:07:30.818 Malloc2p3 : 5.16 694.13 2.71 0.00 0.00 181187.92 3252.53 170102.33 00:07:30.818 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x200 00:07:30.818 Malloc2p4 : 5.22 686.53 2.68 0.00 0.00 182776.40 3301.07 
167772.16 00:07:30.818 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x200 length 0x200 00:07:30.818 Malloc2p4 : 5.17 693.85 2.71 0.00 0.00 180792.86 3325.35 168548.88 00:07:30.818 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x200 00:07:30.818 Malloc2p5 : 5.22 686.23 2.68 0.00 0.00 182381.02 3228.25 163888.55 00:07:30.818 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x200 length 0x200 00:07:30.818 Malloc2p5 : 5.17 693.58 2.71 0.00 0.00 180386.16 3252.53 164665.27 00:07:30.818 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x200 00:07:30.818 Malloc2p6 : 5.23 685.77 2.68 0.00 0.00 182065.83 3155.44 162335.10 00:07:30.818 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x200 length 0x200 00:07:30.818 Malloc2p6 : 5.17 693.30 2.71 0.00 0.00 180022.45 3179.71 162335.10 00:07:30.818 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.818 Verification LBA range: start 0x0 length 0x200 00:07:30.818 Malloc2p7 : 5.23 685.26 2.68 0.00 0.00 181757.86 3203.98 157674.76 00:07:30.818 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x200 length 0x200 00:07:30.819 Malloc2p7 : 5.22 710.69 2.78 0.00 0.00 175236.94 3179.71 156898.04 00:07:30.819 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x0 length 0x1000 00:07:30.819 TestPT : 5.25 683.18 2.67 0.00 0.00 181840.89 11456.66 157674.76 00:07:30.819 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x1000 length 0x1000 00:07:30.819 TestPT : 5.23 685.74 2.68 0.00 0.00 181117.59 13495.56 222142.77 00:07:30.819 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x0 length 0x2000 00:07:30.819 raid0 : 5.24 684.45 2.67 0.00 0.00 181004.93 3470.98 146800.64 00:07:30.819 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x2000 length 0x2000 00:07:30.819 raid0 : 5.23 709.71 2.77 0.00 0.00 174535.81 3470.98 137479.96 00:07:30.819 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x0 length 0x2000 00:07:30.819 concat0 : 5.24 683.95 2.67 0.00 0.00 180685.56 3301.07 149130.81 00:07:30.819 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x2000 length 0x2000 00:07:30.819 concat0 : 5.23 709.24 2.77 0.00 0.00 174191.50 3349.62 142140.30 00:07:30.819 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x0 length 0x1000 00:07:30.819 raid1 : 5.24 683.60 2.67 0.00 0.00 180264.94 4102.07 152237.70 00:07:30.819 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x1000 length 0x1000 00:07:30.819 raid1 : 5.24 708.91 2.77 0.00 0.00 173775.62 4053.52 149130.81 00:07:30.819 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x0 
length 0x4e2 00:07:30.819 AIO0 : 5.24 683.42 2.67 0.00 0.00 179750.94 1735.49 156898.04 00:07:30.819 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:07:30.819 Verification LBA range: start 0x4e2 length 0x4e2 00:07:30.819 AIO0 : 5.24 708.45 2.77 0.00 0.00 173347.32 1723.35 155344.59 00:07:30.819 =================================================================================================================== 00:07:30.819 Total : 23402.02 91.41 0.00 0.00 171500.71 458.15 383701.14 00:07:30.819 00:07:30.819 real 0m6.531s 00:07:30.819 user 0m12.100s 00:07:30.819 sys 0m0.401s 00:07:30.819 04:08:18 blockdev_general.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:30.819 04:08:18 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:07:30.819 ************************************ 00:07:30.819 END TEST bdev_verify 00:07:30.819 ************************************ 00:07:30.819 04:08:18 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:07:30.819 04:08:18 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:07:30.819 04:08:18 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:30.819 04:08:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:30.819 ************************************ 00:07:30.819 START TEST bdev_verify_big_io 00:07:30.819 ************************************ 00:07:30.819 04:08:18 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:07:30.819 [2024-05-15 04:08:18.521775] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
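The queue-depth warnings printed during the big-IO job construction below line up with simple size arithmetic. Each Malloc2p* split is 8192 blocks * 512 B = 4 MiB, so a 65536-byte verify IO divides it into 64 regions; AIO0 is 5000 blocks * 2048 B, or 156 such regions. The caps bdevperf reports (32 and 78) are half of those counts, which is consistent with each bdev being driven by two per-core jobs under -m 0x3, as the paired Core Mask 0x1/0x2 rows in the results tables also suggest. The shell arithmetic below only restates that reading; it is an interpretation of the warnings, not something the test script computes.

# Back-of-the-envelope check (not part of the test scripts):
echo $(( 8192 * 512 / 65536 ))        # 64 regions per Malloc2p* split at -o 65536
echo $(( 8192 * 512 / 65536 / 2 ))    # 32 -> matches the Malloc2p* queue-depth cap
echo $(( 5000 * 2048 / 65536 / 2 ))   # 78 -> matches the AIO0 cap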
00:07:30.819 [2024-05-15 04:08:18.521859] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3815840 ] 00:07:30.819 [2024-05-15 04:08:18.603617] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:30.819 [2024-05-15 04:08:18.723779] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:30.819 [2024-05-15 04:08:18.723784] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.077 [2024-05-15 04:08:18.897765] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:31.077 [2024-05-15 04:08:18.897873] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:31.077 [2024-05-15 04:08:18.897891] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:31.077 [2024-05-15 04:08:18.905767] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:31.077 [2024-05-15 04:08:18.905803] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:31.077 [2024-05-15 04:08:18.913775] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:31.077 [2024-05-15 04:08:18.913808] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:31.077 [2024-05-15 04:08:18.999091] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:31.077 [2024-05-15 04:08:18.999192] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:31.077 [2024-05-15 04:08:18.999221] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d90fd0 00:07:31.077 [2024-05-15 04:08:18.999237] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:31.077 [2024-05-15 04:08:19.001007] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:31.077 [2024-05-15 04:08:19.001050] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:31.334 [2024-05-15 04:08:19.178466] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:07:31.334 [2024-05-15 04:08:19.179583] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:07:31.334 [2024-05-15 04:08:19.181280] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:07:31.334 [2024-05-15 04:08:19.182335] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:07:31.334 [2024-05-15 04:08:19.184046] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:07:31.334 [2024-05-15 04:08:19.185068] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:07:31.334 [2024-05-15 04:08:19.186729] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:07:31.334 [2024-05-15 04:08:19.188498] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:07:31.334 [2024-05-15 04:08:19.189527] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:07:31.334 [2024-05-15 04:08:19.191224] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:07:31.335 [2024-05-15 04:08:19.192260] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:07:31.335 [2024-05-15 04:08:19.193931] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:07:31.335 [2024-05-15 04:08:19.194943] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:07:31.335 [2024-05-15 04:08:19.196628] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:07:31.335 [2024-05-15 04:08:19.197648] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:07:31.335 [2024-05-15 04:08:19.199360] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32
00:07:31.335 [2024-05-15 04:08:19.228236] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:07:31.335 [2024-05-15 04:08:19.230632] bdevperf.c:1831:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:07:31.335 Running I/O for 5 seconds...
00:07:39.443
00:07:39.443 Latency(us)
00:07:39.443 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:07:39.443 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x100
00:07:39.443 Malloc0 : 5.80 176.70 11.04 0.00 0.00 711530.98 794.93 2100258.89
00:07:39.443 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x100 length 0x100
00:07:39.443 Malloc0 : 5.92 172.90 10.81 0.00 0.00 727579.83 728.18 2125114.03
00:07:39.443 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x80
00:07:39.443 Malloc1p0 : 6.19 62.03 3.88 0.00 0.00 1923753.92 2803.48 3019898.88
00:07:39.443 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x80 length 0x80
00:07:39.443 Malloc1p0 : 6.16 77.98 4.87 0.00 0.00 1542631.85 3179.71 2547651.32
00:07:39.443 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x80
00:07:39.443 Malloc1p1 : 6.56 39.00 2.44 0.00 0.00 2884758.40 1638.40 4971026.96
00:07:39.443 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x80 length 0x80
00:07:39.443 Malloc1p1 : 6.54 39.16 2.45 0.00 0.00 2902119.89 1589.85 5095302.64
00:07:39.443 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x20
00:07:39.443 Malloc2p0 : 6.12 28.74 1.80 0.00 0.00 1003387.11 670.53 1789569.71
00:07:39.443 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x20 length 0x20
00:07:39.443 Malloc2p0 : 6.10 26.24 1.64 0.00 0.00 1086168.62 694.80 1826852.41
00:07:39.443 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x20
00:07:39.443 Malloc2p1 : 6.13 28.73 1.80 0.00 0.00 994850.18 679.63 1764714.57
00:07:39.443 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x20 length 0x20
00:07:39.443 Malloc2p1 : 6.10 26.23 1.64 0.00 0.00 1077161.56 713.01 1801997.27
00:07:39.443 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x20
00:07:39.443 Malloc2p2 : 6.13 28.73 1.80 0.00 0.00 986629.16 682.67 1739859.44
00:07:39.443 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x20 length 0x20
00:07:39.443 Malloc2p2 : 6.10 26.23 1.64 0.00 0.00 1068400.18 694.80 1777142.14
00:07:39.443 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x20
00:07:39.443 Malloc2p3 : 6.13 28.72 1.80 0.00 0.00 978435.08 679.63 1715004.30
00:07:39.443 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x20 length 0x20
00:07:39.443 Malloc2p3 : 6.10 26.22 1.64 0.00 0.00 1058899.48 691.77 1752287.00
00:07:39.443 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x20
00:07:39.443 Malloc2p4 : 6.13 28.71 1.79 0.00 0.00 970243.06 737.28 1690149.17
00:07:39.443 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x20 length 0x20
00:07:39.443 Malloc2p4 : 6.16 28.58 1.79 0.00 0.00 974607.19 788.86 1727431.87
00:07:39.443 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x20
00:07:39.443 Malloc2p5 : 6.13 28.71 1.79 0.00 0.00 962458.45 725.14 1665294.03
00:07:39.443 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x20 length 0x20
00:07:39.443 Malloc2p5 : 6.16 28.58 1.79 0.00 0.00 966128.92 713.01 1702576.73
00:07:39.443 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x20
00:07:39.443 Malloc2p6 : 6.13 28.70 1.79 0.00 0.00 954526.02 703.91 1640438.90
00:07:39.443 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x20 length 0x20
00:07:39.443 Malloc2p6 : 6.16 28.57 1.79 0.00 0.00 957943.79 719.08 1665294.03
00:07:39.443 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x20
00:07:39.443 Malloc2p7 : 6.13 28.70 1.79 0.00 0.00 945786.82 673.56 1615583.76
00:07:39.443 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x20 length 0x20
00:07:39.443 Malloc2p7 : 6.16 28.57 1.79 0.00 0.00 949583.41 709.97 1640438.90
00:07:39.443 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x100
00:07:39.443 TestPT : 6.69 38.56 2.41 0.00 0.00 2642677.95 95148.56 3951966.44
00:07:39.443 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x100 length 0x100
00:07:39.443 TestPT : 6.63 38.60 2.41 0.00 0.00 2664813.61 71458.51 4001676.71
00:07:39.443 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x200
00:07:39.443 raid0 : 6.57 46.29 2.89 0.00 0.00 2171623.41 1626.26 4424214.00
00:07:39.443 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x200 length 0x200
00:07:39.443 raid0 : 6.48 47.84 2.99 0.00 0.00 2118945.75 1614.13 4548489.67
00:07:39.443 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x200
00:07:39.443 concat0 : 6.69 59.31 3.71 0.00 0.00 1665402.14 1662.67 4250228.05
00:07:39.443 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x200 length 0x200
00:07:39.443 concat0 : 6.54 61.00 3.81 0.00 0.00 1635016.19 1614.13 4374503.73
00:07:39.443 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x0 length 0x100
00:07:39.443 raid1 : 6.74 76.60 4.79 0.00 0.00 1270511.69 2099.58 4101097.24
00:07:39.443 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:39.443 Verification LBA range: start 0x100 length 0x100
00:07:39.444 raid1 : 6.69 59.80 3.74 0.00 0.00 1620907.30 2026.76 4225372.92
00:07:39.444 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536)
00:07:39.444 Verification LBA range: start 0x0 length 0x4e
00:07:39.444 AIO0 : 6.74 58.76 3.67 0.00 0.00 984687.30 603.78 2410948.08
00:07:39.444 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536)
00:07:39.444 Verification LBA range: start 0x4e length 0x4e
00:07:39.444 AIO0 : 6.73 65.99 4.12 0.00 0.00 873857.86 849.54 2497941.05
00:07:39.444 ===================================================================================================================
00:07:39.444 Total : 1569.46 98.09 0.00 0.00 1344022.31 603.78 5095302.64
00:07:39.444
00:07:39.444 real 0m8.111s
00:07:39.444 user 0m15.206s
00:07:39.444 sys 0m0.444s
00:07:39.444 04:08:26 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable
00:07:39.444 04:08:26 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:07:39.444 ************************************
00:07:39.444 END TEST bdev_verify_big_io
00:07:39.444 ************************************
00:07:39.444 04:08:26 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:07:39.444 04:08:26 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:07:39.444 04:08:26 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable
00:07:39.444 04:08:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:07:39.444 ************************************
00:07:39.444 START TEST bdev_write_zeroes
00:07:39.444 ************************************
00:07:39.444 04:08:26 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:07:39.444 [2024-05-15 04:08:26.689691] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization...
00:07:39.444 [2024-05-15 04:08:26.689765] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3816789 ]
00:07:39.444 [2024-05-15 04:08:26.769562] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:39.444 [2024-05-15 04:08:26.892612] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:39.444 [2024-05-15 04:08:27.063333] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:07:39.444 [2024-05-15 04:08:27.063412] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:07:39.444 [2024-05-15 04:08:27.063441] vbdev_passthru.c: 731:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:07:39.444 [2024-05-15 04:08:27.071331] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:07:39.444 [2024-05-15 04:08:27.071367] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:07:39.444 [2024-05-15 04:08:27.079345] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:07:39.444 [2024-05-15 04:08:27.079378] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:07:39.444 [2024-05-15 04:08:27.164087] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:07:39.444 [2024-05-15 04:08:27.164172] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:07:39.444 [2024-05-15 04:08:27.164198] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1422e20
00:07:39.444 [2024-05-15 04:08:27.164214] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed
00:07:39.444 [2024-05-15 04:08:27.166058] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:07:39.444 [2024-05-15 04:08:27.166089] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:07:39.444 Running I/O for 1 seconds...
00:07:40.818
00:07:40.818 Latency(us)
00:07:40.818 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:07:40.818 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.818 Malloc0 : 1.05 5124.89 20.02 0.00 0.00 24983.01 664.46 42913.94
00:07:40.818 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 Malloc1p0 : 1.05 5118.07 19.99 0.00 0.00 24969.14 952.70 41943.04
00:07:40.819 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 Malloc1p1 : 1.05 5111.31 19.97 0.00 0.00 24941.85 928.43 40972.14
00:07:40.819 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 Malloc2p0 : 1.05 5104.64 19.94 0.00 0.00 24928.82 928.43 40195.41
00:07:40.819 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 Malloc2p1 : 1.05 5097.94 19.91 0.00 0.00 24906.82 934.49 39224.51
00:07:40.819 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 Malloc2p2 : 1.06 5091.80 19.89 0.00 0.00 24890.50 910.22 38447.79
00:07:40.819 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 Malloc2p3 : 1.06 5085.56 19.87 0.00 0.00 24866.12 934.49 37476.88
00:07:40.819 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 Malloc2p4 : 1.06 5079.87 19.84 0.00 0.00 24840.55 934.49 36505.98
00:07:40.819 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 Malloc2p5 : 1.06 5073.65 19.82 0.00 0.00 24819.33 922.36 35729.26
00:07:40.819 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 Malloc2p6 : 1.06 5066.95 19.79 0.00 0.00 24805.48 958.77 34758.35
00:07:40.819 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 Malloc2p7 : 1.06 5060.49 19.77 0.00 0.00 24781.62 940.56 33787.45
00:07:40.819 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 TestPT : 1.06 5053.90 19.74 0.00 0.00 24760.82 958.77 32816.55
00:07:40.819 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 raid0 : 1.07 5046.25 19.71 0.00 0.00 24727.85 1735.49 31068.92
00:07:40.819 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 concat0 : 1.07 5039.19 19.68 0.00 0.00 24657.71 1711.22 29321.29
00:07:40.819 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 raid1 : 1.07 5029.93 19.65 0.00 0.00 24593.24 2682.12 26602.76
00:07:40.819 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:40.819 AIO0 : 1.07 5024.23 19.63 0.00 0.00 24499.65 1080.13 25826.04
00:07:40.819 ===================================================================================================================
00:07:40.819 Total : 81208.66 317.22 0.00 0.00 24810.78 664.46 42913.94
00:07:41.077
00:07:41.077 real 0m2.304s
00:07:41.077 user 0m1.902s
00:07:41.077 sys 0m0.320s
00:07:41.077 04:08:28 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable
00:07:41.077 04:08:28 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:07:41.077 ************************************
00:07:41.077 END TEST bdev_write_zeroes
00:07:41.077 ************************************
00:07:41.077 04:08:28 blockdev_general
-- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:07:41.077 04:08:28 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:41.077 04:08:28 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:41.077 04:08:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:41.077 ************************************ 00:07:41.077 START TEST bdev_json_nonenclosed 00:07:41.077 ************************************ 00:07:41.077 04:08:28 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:07:41.077 [2024-05-15 04:08:29.045006] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:07:41.077 [2024-05-15 04:08:29.045078] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3817070 ] 00:07:41.336 [2024-05-15 04:08:29.127004] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.336 [2024-05-15 04:08:29.242892] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.336 [2024-05-15 04:08:29.243005] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:07:41.336 [2024-05-15 04:08:29.243042] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:07:41.336 [2024-05-15 04:08:29.243058] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:41.595 00:07:41.595 real 0m0.383s 00:07:41.595 user 0m0.268s 00:07:41.595 sys 0m0.112s 00:07:41.595 04:08:29 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:41.595 04:08:29 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:07:41.595 ************************************ 00:07:41.595 END TEST bdev_json_nonenclosed 00:07:41.595 ************************************ 00:07:41.595 04:08:29 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:07:41.595 04:08:29 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:41.595 04:08:29 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:41.595 04:08:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:41.595 ************************************ 00:07:41.595 START TEST bdev_json_nonarray 00:07:41.595 ************************************ 00:07:41.595 04:08:29 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:07:41.595 [2024-05-15 04:08:29.479699] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:07:41.595 [2024-05-15 04:08:29.479757] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3817102 ] 00:07:41.595 [2024-05-15 04:08:29.559914] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.854 [2024-05-15 04:08:29.682933] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.854 [2024-05-15 04:08:29.683041] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:07:41.854 [2024-05-15 04:08:29.683069] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:07:41.854 [2024-05-15 04:08:29.683084] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:41.854 00:07:41.854 real 0m0.389s 00:07:41.854 user 0m0.277s 00:07:41.854 sys 0m0.109s 00:07:41.854 04:08:29 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:41.854 04:08:29 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:07:41.854 ************************************ 00:07:41.854 END TEST bdev_json_nonarray 00:07:41.854 ************************************ 00:07:41.854 04:08:29 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:07:41.854 04:08:29 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:07:41.854 04:08:29 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:07:41.854 04:08:29 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:41.854 04:08:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:42.112 ************************************ 00:07:42.112 START TEST bdev_qos 00:07:42.112 ************************************ 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- common/autotest_common.sh@1121 -- # qos_test_suite '' 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=3817245 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 3817245' 00:07:42.112 Process qos testing pid: 3817245 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 3817245 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- common/autotest_common.sh@827 -- # '[' -z 3817245 ']' 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:42.112 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:42.112 04:08:29 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:42.112 04:08:29 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:42.112 [2024-05-15 04:08:29.928241] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:07:42.112 [2024-05-15 04:08:29.928327] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3817245 ] 00:07:42.112 [2024-05-15 04:08:30.009624] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.370 [2024-05-15 04:08:30.134275] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@860 -- # return 0 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:42.936 Malloc_0 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@895 -- # local bdev_name=Malloc_0 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local i 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:42.936 [ 00:07:42.936 { 00:07:42.936 "name": "Malloc_0", 00:07:42.936 "aliases": [ 00:07:42.936 "b7607277-7f3a-41e1-b107-74c5ee3ee052" 00:07:42.936 ], 00:07:42.936 "product_name": "Malloc disk", 00:07:42.936 "block_size": 512, 00:07:42.936 "num_blocks": 262144, 00:07:42.936 "uuid": "b7607277-7f3a-41e1-b107-74c5ee3ee052", 00:07:42.936 "assigned_rate_limits": { 00:07:42.936 "rw_ios_per_sec": 0, 00:07:42.936 "rw_mbytes_per_sec": 0, 00:07:42.936 "r_mbytes_per_sec": 0, 00:07:42.936 "w_mbytes_per_sec": 0 00:07:42.936 }, 00:07:42.936 "claimed": false, 00:07:42.936 "zoned": false, 00:07:42.936 "supported_io_types": { 00:07:42.936 "read": true, 00:07:42.936 "write": true, 00:07:42.936 "unmap": true, 00:07:42.936 "write_zeroes": true, 00:07:42.936 "flush": true, 00:07:42.936 
"reset": true, 00:07:42.936 "compare": false, 00:07:42.936 "compare_and_write": false, 00:07:42.936 "abort": true, 00:07:42.936 "nvme_admin": false, 00:07:42.936 "nvme_io": false 00:07:42.936 }, 00:07:42.936 "memory_domains": [ 00:07:42.936 { 00:07:42.936 "dma_device_id": "system", 00:07:42.936 "dma_device_type": 1 00:07:42.936 }, 00:07:42.936 { 00:07:42.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:42.936 "dma_device_type": 2 00:07:42.936 } 00:07:42.936 ], 00:07:42.936 "driver_specific": {} 00:07:42.936 } 00:07:42.936 ] 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # return 0 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:42.936 Null_1 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@895 -- # local bdev_name=Null_1 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local i 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:42.936 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:43.193 [ 00:07:43.193 { 00:07:43.193 "name": "Null_1", 00:07:43.193 "aliases": [ 00:07:43.193 "13358bbc-0351-4785-ad82-918a38cb1126" 00:07:43.193 ], 00:07:43.193 "product_name": "Null disk", 00:07:43.193 "block_size": 512, 00:07:43.193 "num_blocks": 262144, 00:07:43.193 "uuid": "13358bbc-0351-4785-ad82-918a38cb1126", 00:07:43.193 "assigned_rate_limits": { 00:07:43.193 "rw_ios_per_sec": 0, 00:07:43.193 "rw_mbytes_per_sec": 0, 00:07:43.193 "r_mbytes_per_sec": 0, 00:07:43.193 "w_mbytes_per_sec": 0 00:07:43.193 }, 00:07:43.193 "claimed": false, 00:07:43.193 "zoned": false, 00:07:43.193 "supported_io_types": { 00:07:43.193 "read": true, 00:07:43.193 "write": true, 00:07:43.193 "unmap": false, 00:07:43.193 "write_zeroes": true, 00:07:43.193 "flush": false, 00:07:43.193 "reset": true, 00:07:43.193 "compare": false, 00:07:43.193 "compare_and_write": false, 00:07:43.193 "abort": true, 00:07:43.193 "nvme_admin": false, 00:07:43.193 "nvme_io": false 00:07:43.193 }, 00:07:43.193 "driver_specific": {} 00:07:43.193 } 00:07:43.193 ] 00:07:43.193 04:08:30 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # return 0 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:07:43.193 04:08:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:07:43.193 Running I/O for 60 seconds... 00:07:48.457 04:08:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 64394.30 257577.19 0.00 0.00 259072.00 0.00 0.00 ' 00:07:48.457 04:08:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:07:48.457 04:08:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:07:48.457 04:08:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=64394.30 00:07:48.457 04:08:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 64394 00:07:48.457 04:08:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=64394 00:07:48.458 04:08:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=16000 00:07:48.458 04:08:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 16000 -gt 1000 ']' 00:07:48.458 04:08:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 16000 Malloc_0 00:07:48.458 04:08:36 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.458 04:08:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:48.458 04:08:36 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.458 04:08:36 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 16000 IOPS Malloc_0 00:07:48.458 04:08:36 blockdev_general.bdev_qos -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:07:48.458 04:08:36 blockdev_general.bdev_qos -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:48.458 04:08:36 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:48.458 ************************************ 00:07:48.458 START TEST bdev_qos_iops 00:07:48.458 ************************************ 00:07:48.458 04:08:36 
blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1121 -- # run_qos_test 16000 IOPS Malloc_0 00:07:48.458 04:08:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=16000 00:07:48.458 04:08:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:07:48.458 04:08:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:07:48.458 04:08:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:07:48.458 04:08:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:07:48.458 04:08:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:07:48.458 04:08:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:07:48.458 04:08:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:07:48.458 04:08:36 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 16001.57 64006.26 0.00 0.00 64768.00 0.00 0.00 ' 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=16001.57 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 16001 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=16001 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=14400 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=17600 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 16001 -lt 14400 ']' 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 16001 -gt 17600 ']' 00:07:53.752 00:07:53.752 real 0m5.197s 00:07:53.752 user 0m0.099s 00:07:53.752 sys 0m0.026s 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:53.752 04:08:41 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:07:53.752 ************************************ 00:07:53.752 END TEST bdev_qos_iops 00:07:53.752 ************************************ 00:07:53.752 04:08:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:07:53.752 04:08:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:07:53.752 04:08:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:07:53.752 04:08:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:07:53.752 04:08:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:07:53.752 04:08:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:07:53.752 
04:08:41 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 22191.87 88767.50 0.00 0.00 90112.00 0.00 0.00 ' 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=90112.00 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 90112 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=90112 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:07:59.012 04:08:46 blockdev_general.bdev_qos -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:59.013 04:08:46 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:07:59.013 ************************************ 00:07:59.013 START TEST bdev_qos_bw 00:07:59.013 ************************************ 00:07:59.013 04:08:46 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1121 -- # run_qos_test 8 BANDWIDTH Null_1 00:07:59.013 04:08:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:07:59.013 04:08:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:07:59.013 04:08:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:07:59.013 04:08:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:07:59.013 04:08:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:07:59.013 04:08:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:07:59.013 04:08:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:07:59.013 04:08:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:07:59.013 04:08:46 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2048.56 8194.24 0.00 0.00 8424.00 0.00 0.00 ' 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:04.272 04:08:51 
blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8424.00 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8424 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8424 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:08:04.272 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8424 -lt 7372 ']' 00:08:04.273 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8424 -gt 9011 ']' 00:08:04.273 00:08:04.273 real 0m5.260s 00:08:04.273 user 0m0.093s 00:08:04.273 sys 0m0.032s 00:08:04.273 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:04.273 04:08:51 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:08:04.273 ************************************ 00:08:04.273 END TEST bdev_qos_bw 00:08:04.273 ************************************ 00:08:04.273 04:08:51 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:08:04.273 04:08:51 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:04.273 04:08:51 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:04.273 04:08:51 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:04.273 04:08:51 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:08:04.273 04:08:51 blockdev_general.bdev_qos -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:08:04.273 04:08:51 blockdev_general.bdev_qos -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:04.273 04:08:51 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:04.273 ************************************ 00:08:04.273 START TEST bdev_qos_ro_bw 00:08:04.273 ************************************ 00:08:04.273 04:08:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1121 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:08:04.273 04:08:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:08:04.273 04:08:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:04.273 04:08:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:08:04.273 04:08:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:04.273 04:08:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:04.273 04:08:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:04.273 04:08:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:04.273 04:08:52 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:04.273 04:08:52 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.86 2047.46 0.00 0.00 2052.00 0.00 0.00 ' 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2052.00 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2052 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2052 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -lt 1843 ']' 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -gt 2252 ']' 00:08:09.532 00:08:09.532 real 0m5.153s 00:08:09.532 user 0m0.104s 00:08:09.532 sys 0m0.021s 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:09.532 04:08:57 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:08:09.532 ************************************ 00:08:09.532 END TEST bdev_qos_ro_bw 00:08:09.532 ************************************ 00:08:09.532 04:08:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:08:09.532 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:09.532 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:09.791 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:09.791 04:08:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:08:09.791 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:09.791 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:10.048 00:08:10.048 Latency(us) 00:08:10.048 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:10.048 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:10.048 Malloc_0 : 26.57 22021.10 86.02 0.00 0.00 11507.14 2111.72 503316.48 00:08:10.048 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:10.048 Null_1 : 26.72 22336.96 87.25 0.00 0.00 11429.69 849.54 153014.42 00:08:10.048 =================================================================================================================== 00:08:10.048 Total : 44358.06 173.27 0.00 0.00 11468.03 849.54 503316.48 00:08:10.048 0 00:08:10.048 04:08:57 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:10.048 04:08:57 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 3817245 00:08:10.048 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@946 -- # '[' -z 3817245 ']' 00:08:10.048 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@950 -- # kill -0 3817245 00:08:10.048 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@951 -- # uname 00:08:10.048 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:10.048 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3817245 00:08:10.048 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:08:10.048 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:08:10.048 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3817245' 00:08:10.048 killing process with pid 3817245 00:08:10.048 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@965 -- # kill 3817245 00:08:10.048 Received shutdown signal, test time was about 26.762829 seconds 00:08:10.048 00:08:10.049 Latency(us) 00:08:10.049 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:10.049 =================================================================================================================== 00:08:10.049 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:10.049 04:08:57 blockdev_general.bdev_qos -- common/autotest_common.sh@970 -- # wait 3817245 00:08:10.306 04:08:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:08:10.306 00:08:10.306 real 0m28.291s 00:08:10.306 user 0m28.915s 00:08:10.306 sys 0m0.622s 00:08:10.306 04:08:58 blockdev_general.bdev_qos -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:10.306 04:08:58 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:10.306 ************************************ 00:08:10.306 END TEST bdev_qos 00:08:10.306 ************************************ 00:08:10.306 04:08:58 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:08:10.306 04:08:58 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:08:10.306 04:08:58 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:10.306 04:08:58 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:10.306 ************************************ 00:08:10.306 START TEST bdev_qd_sampling 00:08:10.306 ************************************ 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1121 -- # qd_sampling_test_suite '' 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=3820670 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 3820670' 00:08:10.306 Process bdev QD sampling period testing pid: 3820670 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # 
trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 3820670 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@827 -- # '[' -z 3820670 ']' 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:10.306 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:10.306 04:08:58 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:10.306 [2024-05-15 04:08:58.277978] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:08:10.306 [2024-05-15 04:08:58.278063] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3820670 ] 00:08:10.563 [2024-05-15 04:08:58.360940] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:10.563 [2024-05-15 04:08:58.478868] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:10.563 [2024-05-15 04:08:58.478875] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@860 -- # return 0 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:11.497 Malloc_QD 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@895 -- # local bdev_name=Malloc_QD 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local i 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:11.497 04:08:59 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:11.497 [ 00:08:11.497 { 00:08:11.497 "name": "Malloc_QD", 00:08:11.497 "aliases": [ 00:08:11.497 "b8589b76-c1ea-4a71-8cbe-87263ee4852e" 00:08:11.497 ], 00:08:11.497 "product_name": "Malloc disk", 00:08:11.497 "block_size": 512, 00:08:11.497 "num_blocks": 262144, 00:08:11.497 "uuid": "b8589b76-c1ea-4a71-8cbe-87263ee4852e", 00:08:11.497 "assigned_rate_limits": { 00:08:11.497 "rw_ios_per_sec": 0, 00:08:11.497 "rw_mbytes_per_sec": 0, 00:08:11.497 "r_mbytes_per_sec": 0, 00:08:11.497 "w_mbytes_per_sec": 0 00:08:11.497 }, 00:08:11.497 "claimed": false, 00:08:11.497 "zoned": false, 00:08:11.497 "supported_io_types": { 00:08:11.497 "read": true, 00:08:11.497 "write": true, 00:08:11.497 "unmap": true, 00:08:11.497 "write_zeroes": true, 00:08:11.497 "flush": true, 00:08:11.497 "reset": true, 00:08:11.497 "compare": false, 00:08:11.497 "compare_and_write": false, 00:08:11.497 "abort": true, 00:08:11.497 "nvme_admin": false, 00:08:11.497 "nvme_io": false 00:08:11.497 }, 00:08:11.497 "memory_domains": [ 00:08:11.497 { 00:08:11.497 "dma_device_id": "system", 00:08:11.497 "dma_device_type": 1 00:08:11.497 }, 00:08:11.497 { 00:08:11.497 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:11.497 "dma_device_type": 2 00:08:11.497 } 00:08:11.497 ], 00:08:11.497 "driver_specific": {} 00:08:11.497 } 00:08:11.497 ] 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@903 -- # return 0 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:08:11.497 04:08:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:11.497 Running I/O for 5 seconds... 
00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:08:13.399 "tick_rate": 2700000000, 00:08:13.399 "ticks": 5254159692155502, 00:08:13.399 "bdevs": [ 00:08:13.399 { 00:08:13.399 "name": "Malloc_QD", 00:08:13.399 "bytes_read": 921743872, 00:08:13.399 "num_read_ops": 225028, 00:08:13.399 "bytes_written": 0, 00:08:13.399 "num_write_ops": 0, 00:08:13.399 "bytes_unmapped": 0, 00:08:13.399 "num_unmap_ops": 0, 00:08:13.399 "bytes_copied": 0, 00:08:13.399 "num_copy_ops": 0, 00:08:13.399 "read_latency_ticks": 2649610777241, 00:08:13.399 "max_read_latency_ticks": 16811775, 00:08:13.399 "min_read_latency_ticks": 538959, 00:08:13.399 "write_latency_ticks": 0, 00:08:13.399 "max_write_latency_ticks": 0, 00:08:13.399 "min_write_latency_ticks": 0, 00:08:13.399 "unmap_latency_ticks": 0, 00:08:13.399 "max_unmap_latency_ticks": 0, 00:08:13.399 "min_unmap_latency_ticks": 0, 00:08:13.399 "copy_latency_ticks": 0, 00:08:13.399 "max_copy_latency_ticks": 0, 00:08:13.399 "min_copy_latency_ticks": 0, 00:08:13.399 "io_error": {}, 00:08:13.399 "queue_depth_polling_period": 10, 00:08:13.399 "queue_depth": 512, 00:08:13.399 "io_time": 20, 00:08:13.399 "weighted_io_time": 10240 00:08:13.399 } 00:08:13.399 ] 00:08:13.399 }' 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:13.399 00:08:13.399 Latency(us) 00:08:13.399 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:13.399 Job: Malloc_QD (Core Mask 0x1, workload: randread, 
depth: 256, IO size: 4096) 00:08:13.399 Malloc_QD : 1.98 58035.79 226.70 0.00 0.00 4400.36 1019.45 6262.33 00:08:13.399 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:13.399 Malloc_QD : 1.98 59513.69 232.48 0.00 0.00 4291.44 776.72 5801.15 00:08:13.399 =================================================================================================================== 00:08:13.399 Total : 117549.48 459.18 0.00 0.00 4345.18 776.72 6262.33 00:08:13.399 0 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 3820670 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@946 -- # '[' -z 3820670 ']' 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@950 -- # kill -0 3820670 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@951 -- # uname 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:13.399 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3820670 00:08:13.658 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:13.658 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:13.658 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3820670' 00:08:13.658 killing process with pid 3820670 00:08:13.658 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@965 -- # kill 3820670 00:08:13.658 Received shutdown signal, test time was about 2.039705 seconds 00:08:13.658 00:08:13.658 Latency(us) 00:08:13.658 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:13.658 =================================================================================================================== 00:08:13.658 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:13.658 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@970 -- # wait 3820670 00:08:13.916 04:09:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:08:13.916 00:08:13.916 real 0m3.458s 00:08:13.916 user 0m6.792s 00:08:13.916 sys 0m0.347s 00:08:13.916 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:13.916 04:09:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:08:13.916 ************************************ 00:08:13.916 END TEST bdev_qd_sampling 00:08:13.916 ************************************ 00:08:13.916 04:09:01 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:08:13.916 04:09:01 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:08:13.916 04:09:01 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:13.916 04:09:01 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:13.916 ************************************ 00:08:13.916 START TEST bdev_error 00:08:13.916 ************************************ 00:08:13.916 04:09:01 blockdev_general.bdev_error -- common/autotest_common.sh@1121 -- # error_test_suite '' 00:08:13.916 04:09:01 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 
00:08:13.916 04:09:01 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:08:13.916 04:09:01 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:08:13.916 04:09:01 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=3821201 00:08:13.916 04:09:01 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:08:13.916 04:09:01 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 3821201' 00:08:13.916 Process error testing pid: 3821201 00:08:13.916 04:09:01 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 3821201 00:08:13.916 04:09:01 blockdev_general.bdev_error -- common/autotest_common.sh@827 -- # '[' -z 3821201 ']' 00:08:13.916 04:09:01 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:13.916 04:09:01 blockdev_general.bdev_error -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:13.916 04:09:01 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:13.916 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:13.916 04:09:01 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:13.916 04:09:01 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:13.917 [2024-05-15 04:09:01.793323] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:08:13.917 [2024-05-15 04:09:01.793405] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3821201 ] 00:08:13.917 [2024-05-15 04:09:01.877976] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.174 [2024-05-15 04:09:01.997339] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:14.740 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:14.740 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # return 0 00:08:14.740 04:09:02 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:08:14.740 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:14.740 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:14.999 Dev_1 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:14.999 04:09:02 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_1 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 
00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:14.999 [ 00:08:14.999 { 00:08:14.999 "name": "Dev_1", 00:08:14.999 "aliases": [ 00:08:14.999 "96cf18a9-b951-4606-a711-cb3afb0a7519" 00:08:14.999 ], 00:08:14.999 "product_name": "Malloc disk", 00:08:14.999 "block_size": 512, 00:08:14.999 "num_blocks": 262144, 00:08:14.999 "uuid": "96cf18a9-b951-4606-a711-cb3afb0a7519", 00:08:14.999 "assigned_rate_limits": { 00:08:14.999 "rw_ios_per_sec": 0, 00:08:14.999 "rw_mbytes_per_sec": 0, 00:08:14.999 "r_mbytes_per_sec": 0, 00:08:14.999 "w_mbytes_per_sec": 0 00:08:14.999 }, 00:08:14.999 "claimed": false, 00:08:14.999 "zoned": false, 00:08:14.999 "supported_io_types": { 00:08:14.999 "read": true, 00:08:14.999 "write": true, 00:08:14.999 "unmap": true, 00:08:14.999 "write_zeroes": true, 00:08:14.999 "flush": true, 00:08:14.999 "reset": true, 00:08:14.999 "compare": false, 00:08:14.999 "compare_and_write": false, 00:08:14.999 "abort": true, 00:08:14.999 "nvme_admin": false, 00:08:14.999 "nvme_io": false 00:08:14.999 }, 00:08:14.999 "memory_domains": [ 00:08:14.999 { 00:08:14.999 "dma_device_id": "system", 00:08:14.999 "dma_device_type": 1 00:08:14.999 }, 00:08:14.999 { 00:08:14.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:14.999 "dma_device_type": 2 00:08:14.999 } 00:08:14.999 ], 00:08:14.999 "driver_specific": {} 00:08:14.999 } 00:08:14.999 ] 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # return 0 00:08:14.999 04:09:02 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:14.999 true 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:14.999 04:09:02 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:14.999 Dev_2 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:14.999 04:09:02 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_2 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:08:14.999 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:08:14.999 04:09:02 
blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:15.000 [ 00:08:15.000 { 00:08:15.000 "name": "Dev_2", 00:08:15.000 "aliases": [ 00:08:15.000 "8533dad1-114f-43d7-9782-183d933480ac" 00:08:15.000 ], 00:08:15.000 "product_name": "Malloc disk", 00:08:15.000 "block_size": 512, 00:08:15.000 "num_blocks": 262144, 00:08:15.000 "uuid": "8533dad1-114f-43d7-9782-183d933480ac", 00:08:15.000 "assigned_rate_limits": { 00:08:15.000 "rw_ios_per_sec": 0, 00:08:15.000 "rw_mbytes_per_sec": 0, 00:08:15.000 "r_mbytes_per_sec": 0, 00:08:15.000 "w_mbytes_per_sec": 0 00:08:15.000 }, 00:08:15.000 "claimed": false, 00:08:15.000 "zoned": false, 00:08:15.000 "supported_io_types": { 00:08:15.000 "read": true, 00:08:15.000 "write": true, 00:08:15.000 "unmap": true, 00:08:15.000 "write_zeroes": true, 00:08:15.000 "flush": true, 00:08:15.000 "reset": true, 00:08:15.000 "compare": false, 00:08:15.000 "compare_and_write": false, 00:08:15.000 "abort": true, 00:08:15.000 "nvme_admin": false, 00:08:15.000 "nvme_io": false 00:08:15.000 }, 00:08:15.000 "memory_domains": [ 00:08:15.000 { 00:08:15.000 "dma_device_id": "system", 00:08:15.000 "dma_device_type": 1 00:08:15.000 }, 00:08:15.000 { 00:08:15.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:15.000 "dma_device_type": 2 00:08:15.000 } 00:08:15.000 ], 00:08:15.000 "driver_specific": {} 00:08:15.000 } 00:08:15.000 ] 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # return 0 00:08:15.000 04:09:02 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:15.000 04:09:02 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:15.000 04:09:02 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:08:15.000 04:09:02 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:08:15.000 Running I/O for 5 seconds... 00:08:15.934 04:09:03 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 3821201 00:08:15.934 04:09:03 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 3821201' 00:08:15.935 Process is existed as continue on error is set. 
Pid: 3821201 00:08:15.935 04:09:03 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:08:15.935 04:09:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:15.935 04:09:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:15.935 04:09:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:15.935 04:09:03 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:08:15.935 04:09:03 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:15.935 04:09:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:15.935 04:09:03 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:15.935 04:09:03 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:08:16.193 Timeout while waiting for response: 00:08:16.193 00:08:16.193 00:08:20.376 00:08:20.376 Latency(us) 00:08:20.376 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:20.376 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:08:20.376 EE_Dev_1 : 0.91 37297.64 145.69 5.51 0.00 424.98 135.77 728.18 00:08:20.376 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:08:20.376 Dev_2 : 5.00 81255.03 317.40 0.00 0.00 193.22 73.20 27767.85 00:08:20.376 =================================================================================================================== 00:08:20.376 Total : 118552.67 463.10 5.51 0.00 211.05 73.20 27767.85 00:08:20.943 04:09:08 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 3821201 00:08:20.943 04:09:08 blockdev_general.bdev_error -- common/autotest_common.sh@946 -- # '[' -z 3821201 ']' 00:08:20.943 04:09:08 blockdev_general.bdev_error -- common/autotest_common.sh@950 -- # kill -0 3821201 00:08:20.943 04:09:08 blockdev_general.bdev_error -- common/autotest_common.sh@951 -- # uname 00:08:20.943 04:09:08 blockdev_general.bdev_error -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:20.943 04:09:08 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3821201 00:08:20.943 04:09:08 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:08:20.943 04:09:08 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:08:20.943 04:09:08 blockdev_general.bdev_error -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3821201' 00:08:20.943 killing process with pid 3821201 00:08:20.943 04:09:08 blockdev_general.bdev_error -- common/autotest_common.sh@965 -- # kill 3821201 00:08:20.943 Received shutdown signal, test time was about 5.000000 seconds 00:08:20.943 00:08:20.943 Latency(us) 00:08:20.943 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:20.943 =================================================================================================================== 00:08:20.943 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:20.943 04:09:08 blockdev_general.bdev_error -- common/autotest_common.sh@970 -- # wait 3821201 00:08:21.510 04:09:09 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=3822643 00:08:21.510 04:09:09 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 
00:08:21.510 04:09:09 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 3822643' 00:08:21.510 Process error testing pid: 3822643 00:08:21.510 04:09:09 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 3822643 00:08:21.510 04:09:09 blockdev_general.bdev_error -- common/autotest_common.sh@827 -- # '[' -z 3822643 ']' 00:08:21.510 04:09:09 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:21.510 04:09:09 blockdev_general.bdev_error -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:21.510 04:09:09 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:21.510 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:21.510 04:09:09 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:21.510 04:09:09 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:21.510 [2024-05-15 04:09:09.316200] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:08:21.510 [2024-05-15 04:09:09.316279] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3822643 ] 00:08:21.510 [2024-05-15 04:09:09.391843] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.510 [2024-05-15 04:09:09.498267] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # return 0 00:08:22.444 04:09:10 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:22.444 Dev_1 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:22.444 04:09:10 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_1 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:08:22.444 04:09:10 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:22.444 [ 00:08:22.444 { 00:08:22.444 "name": "Dev_1", 00:08:22.444 "aliases": [ 00:08:22.444 "44734e1e-1cc1-4d93-99d4-225686c3fd5a" 00:08:22.444 ], 00:08:22.444 "product_name": "Malloc disk", 00:08:22.444 "block_size": 512, 00:08:22.444 "num_blocks": 262144, 00:08:22.444 "uuid": "44734e1e-1cc1-4d93-99d4-225686c3fd5a", 00:08:22.444 "assigned_rate_limits": { 00:08:22.444 "rw_ios_per_sec": 0, 00:08:22.444 "rw_mbytes_per_sec": 0, 00:08:22.444 "r_mbytes_per_sec": 0, 00:08:22.444 "w_mbytes_per_sec": 0 00:08:22.444 }, 00:08:22.444 "claimed": false, 00:08:22.444 "zoned": false, 00:08:22.444 "supported_io_types": { 00:08:22.444 "read": true, 00:08:22.444 "write": true, 00:08:22.444 "unmap": true, 00:08:22.444 "write_zeroes": true, 00:08:22.444 "flush": true, 00:08:22.444 "reset": true, 00:08:22.444 "compare": false, 00:08:22.444 "compare_and_write": false, 00:08:22.444 "abort": true, 00:08:22.444 "nvme_admin": false, 00:08:22.444 "nvme_io": false 00:08:22.444 }, 00:08:22.444 "memory_domains": [ 00:08:22.444 { 00:08:22.444 "dma_device_id": "system", 00:08:22.444 "dma_device_type": 1 00:08:22.444 }, 00:08:22.444 { 00:08:22.444 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:22.444 "dma_device_type": 2 00:08:22.444 } 00:08:22.444 ], 00:08:22.444 "driver_specific": {} 00:08:22.444 } 00:08:22.444 ] 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # return 0 00:08:22.444 04:09:10 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:22.444 true 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:22.444 04:09:10 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:22.444 Dev_2 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:22.444 04:09:10 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@895 -- # local bdev_name=Dev_2 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local i 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 
00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:22.444 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:22.444 [ 00:08:22.444 { 00:08:22.444 "name": "Dev_2", 00:08:22.444 "aliases": [ 00:08:22.444 "eceacf5e-9cdf-4248-8c80-f256429c7723" 00:08:22.444 ], 00:08:22.444 "product_name": "Malloc disk", 00:08:22.444 "block_size": 512, 00:08:22.444 "num_blocks": 262144, 00:08:22.444 "uuid": "eceacf5e-9cdf-4248-8c80-f256429c7723", 00:08:22.444 "assigned_rate_limits": { 00:08:22.444 "rw_ios_per_sec": 0, 00:08:22.444 "rw_mbytes_per_sec": 0, 00:08:22.444 "r_mbytes_per_sec": 0, 00:08:22.444 "w_mbytes_per_sec": 0 00:08:22.444 }, 00:08:22.444 "claimed": false, 00:08:22.444 "zoned": false, 00:08:22.444 "supported_io_types": { 00:08:22.444 "read": true, 00:08:22.444 "write": true, 00:08:22.444 "unmap": true, 00:08:22.444 "write_zeroes": true, 00:08:22.445 "flush": true, 00:08:22.445 "reset": true, 00:08:22.445 "compare": false, 00:08:22.445 "compare_and_write": false, 00:08:22.445 "abort": true, 00:08:22.445 "nvme_admin": false, 00:08:22.445 "nvme_io": false 00:08:22.445 }, 00:08:22.445 "memory_domains": [ 00:08:22.445 { 00:08:22.445 "dma_device_id": "system", 00:08:22.445 "dma_device_type": 1 00:08:22.445 }, 00:08:22.445 { 00:08:22.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:22.445 "dma_device_type": 2 00:08:22.445 } 00:08:22.445 ], 00:08:22.445 "driver_specific": {} 00:08:22.445 } 00:08:22.445 ] 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # return 0 00:08:22.445 04:09:10 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:22.445 04:09:10 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 3822643 00:08:22.445 04:09:10 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 3822643 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:22.445 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 3822643 00:08:22.703 Running I/O for 5 seconds... 
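Stripped to its RPC calls, the flow this error test drives is roughly the following (a sketch mirroring the traced rpc_cmd invocations, run from the SPDK tree; not the verbatim blockdev.sh code):

RPC="./scripts/rpc.py"                                    # default socket /var/tmp/spdk.sock

$RPC bdev_malloc_create -b Dev_1 128 512                  # 262144 blocks of 512 B, matching the bdev dump above
$RPC bdev_error_create Dev_1                              # exposes EE_Dev_1 stacked on top of Dev_1
$RPC bdev_malloc_create -b Dev_2 128 512
$RPC bdev_error_inject_error EE_Dev_1 all failure -n 5    # arm five injected failures, per the traced call

# This second bdevperf instance was started without -f (continue on error is
# not set), so the run below is expected to fail once the injected errors hit;
# the test wraps it in the NOT helper to assert exactly that.
./examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests
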
00:08:22.703 task offset: 113280 on job bdev=EE_Dev_1 fails 00:08:22.703 00:08:22.703 Latency(us) 00:08:22.703 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:22.703 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:08:22.703 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:08:22.703 EE_Dev_1 : 0.00 26894.87 105.06 6112.47 0.00 394.16 144.88 709.97 00:08:22.703 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:08:22.703 Dev_2 : 0.00 18275.27 71.39 0.00 0.00 625.05 139.57 1152.95 00:08:22.703 =================================================================================================================== 00:08:22.703 Total : 45170.14 176.45 6112.47 0.00 519.39 139.57 1152.95 00:08:22.703 [2024-05-15 04:09:10.520437] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:22.703 request: 00:08:22.703 { 00:08:22.703 "method": "perform_tests", 00:08:22.703 "req_id": 1 00:08:22.703 } 00:08:22.703 Got JSON-RPC error response 00:08:22.703 response: 00:08:22.703 { 00:08:22.703 "code": -32603, 00:08:22.703 "message": "bdevperf failed with error Operation not permitted" 00:08:22.703 } 00:08:22.962 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:08:22.962 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:22.962 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:08:22.962 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:08:22.962 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:08:22.962 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:22.962 00:08:22.962 real 0m9.144s 00:08:22.962 user 0m9.543s 00:08:22.962 sys 0m0.768s 00:08:22.962 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:22.962 04:09:10 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:08:22.962 ************************************ 00:08:22.962 END TEST bdev_error 00:08:22.962 ************************************ 00:08:22.962 04:09:10 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:08:22.962 04:09:10 blockdev_general -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:08:22.962 04:09:10 blockdev_general -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:22.962 04:09:10 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:22.962 ************************************ 00:08:22.962 START TEST bdev_stat 00:08:22.962 ************************************ 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- common/autotest_common.sh@1121 -- # stat_test_suite '' 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=3822818 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 3822818' 00:08:22.962 Process Bdev IO statistics testing pid: 3822818 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT 
SIGTERM EXIT 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 3822818 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- common/autotest_common.sh@827 -- # '[' -z 3822818 ']' 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:22.962 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:22.962 04:09:10 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:23.221 [2024-05-15 04:09:10.990895] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:08:23.221 [2024-05-15 04:09:10.990968] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3822818 ] 00:08:23.221 [2024-05-15 04:09:11.067280] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:23.221 [2024-05-15 04:09:11.178312] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:08:23.221 [2024-05-15 04:09:11.178315] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@860 -- # return 0 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:23.487 Malloc_STAT 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@895 -- # local bdev_name=Malloc_STAT 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local i 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 
00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:23.487 [ 00:08:23.487 { 00:08:23.487 "name": "Malloc_STAT", 00:08:23.487 "aliases": [ 00:08:23.487 "96b2a187-d893-4ab9-8005-a3f4e30c50ce" 00:08:23.487 ], 00:08:23.487 "product_name": "Malloc disk", 00:08:23.487 "block_size": 512, 00:08:23.487 "num_blocks": 262144, 00:08:23.487 "uuid": "96b2a187-d893-4ab9-8005-a3f4e30c50ce", 00:08:23.487 "assigned_rate_limits": { 00:08:23.487 "rw_ios_per_sec": 0, 00:08:23.487 "rw_mbytes_per_sec": 0, 00:08:23.487 "r_mbytes_per_sec": 0, 00:08:23.487 "w_mbytes_per_sec": 0 00:08:23.487 }, 00:08:23.487 "claimed": false, 00:08:23.487 "zoned": false, 00:08:23.487 "supported_io_types": { 00:08:23.487 "read": true, 00:08:23.487 "write": true, 00:08:23.487 "unmap": true, 00:08:23.487 "write_zeroes": true, 00:08:23.487 "flush": true, 00:08:23.487 "reset": true, 00:08:23.487 "compare": false, 00:08:23.487 "compare_and_write": false, 00:08:23.487 "abort": true, 00:08:23.487 "nvme_admin": false, 00:08:23.487 "nvme_io": false 00:08:23.487 }, 00:08:23.487 "memory_domains": [ 00:08:23.487 { 00:08:23.487 "dma_device_id": "system", 00:08:23.487 "dma_device_type": 1 00:08:23.487 }, 00:08:23.487 { 00:08:23.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:23.487 "dma_device_type": 2 00:08:23.487 } 00:08:23.487 ], 00:08:23.487 "driver_specific": {} 00:08:23.487 } 00:08:23.487 ] 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- common/autotest_common.sh@903 -- # return 0 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:08:23.487 04:09:11 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:23.487 Running I/O for 10 seconds... 
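The stat pass that follows compares aggregate and per-channel read counters for Malloc_STAT; stripped of the xtrace noise, the check is roughly this (a sketch based on the traced rpc_cmd/jq calls, not the verbatim script):

RPC="./scripts/rpc.py"                                    # default socket /var/tmp/spdk.sock

io_count1=$($RPC bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')
per_channel=$($RPC bdev_get_iostat -b Malloc_STAT -c)     # -c: per-channel statistics
ch0=$(jq -r '.channels[0].num_read_ops' <<< "$per_channel")
ch1=$(jq -r '.channels[1].num_read_ops' <<< "$per_channel")
io_count2=$($RPC bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')

# the per-channel sum must land between the two aggregate snapshots
[ $((ch0 + ch1)) -ge "$io_count1" ] && [ $((ch0 + ch1)) -le "$io_count2" ] || exit 1
$RPC bdev_malloc_delete Malloc_STAT                       # cleanup, as at blockdev.sh@608
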
00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:25.460 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:08:25.460 "tick_rate": 2700000000, 00:08:25.460 "ticks": 5254192228104540, 00:08:25.460 "bdevs": [ 00:08:25.460 { 00:08:25.460 "name": "Malloc_STAT", 00:08:25.460 "bytes_read": 898675200, 00:08:25.461 "num_read_ops": 219396, 00:08:25.461 "bytes_written": 0, 00:08:25.461 "num_write_ops": 0, 00:08:25.461 "bytes_unmapped": 0, 00:08:25.461 "num_unmap_ops": 0, 00:08:25.461 "bytes_copied": 0, 00:08:25.461 "num_copy_ops": 0, 00:08:25.461 "read_latency_ticks": 2620286426595, 00:08:25.461 "max_read_latency_ticks": 15857883, 00:08:25.461 "min_read_latency_ticks": 438727, 00:08:25.461 "write_latency_ticks": 0, 00:08:25.461 "max_write_latency_ticks": 0, 00:08:25.461 "min_write_latency_ticks": 0, 00:08:25.461 "unmap_latency_ticks": 0, 00:08:25.461 "max_unmap_latency_ticks": 0, 00:08:25.461 "min_unmap_latency_ticks": 0, 00:08:25.461 "copy_latency_ticks": 0, 00:08:25.461 "max_copy_latency_ticks": 0, 00:08:25.461 "min_copy_latency_ticks": 0, 00:08:25.461 "io_error": {} 00:08:25.461 } 00:08:25.461 ] 00:08:25.461 }' 00:08:25.461 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:08:25.461 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=219396 00:08:25.461 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:08:25.461 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:25.461 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:25.461 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:25.461 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:08:25.461 "tick_rate": 2700000000, 00:08:25.461 "ticks": 5254192372300319, 00:08:25.461 "name": "Malloc_STAT", 00:08:25.461 "channels": [ 00:08:25.461 { 00:08:25.461 "thread_id": 2, 00:08:25.461 "bytes_read": 460324864, 00:08:25.461 "num_read_ops": 112384, 00:08:25.461 "bytes_written": 0, 00:08:25.461 "num_write_ops": 0, 00:08:25.461 "bytes_unmapped": 0, 00:08:25.461 "num_unmap_ops": 0, 
00:08:25.461 "bytes_copied": 0, 00:08:25.461 "num_copy_ops": 0, 00:08:25.461 "read_latency_ticks": 1346690924269, 00:08:25.461 "max_read_latency_ticks": 12890493, 00:08:25.461 "min_read_latency_ticks": 9293038, 00:08:25.461 "write_latency_ticks": 0, 00:08:25.461 "max_write_latency_ticks": 0, 00:08:25.461 "min_write_latency_ticks": 0, 00:08:25.461 "unmap_latency_ticks": 0, 00:08:25.461 "max_unmap_latency_ticks": 0, 00:08:25.461 "min_unmap_latency_ticks": 0, 00:08:25.461 "copy_latency_ticks": 0, 00:08:25.461 "max_copy_latency_ticks": 0, 00:08:25.461 "min_copy_latency_ticks": 0 00:08:25.461 }, 00:08:25.461 { 00:08:25.461 "thread_id": 3, 00:08:25.461 "bytes_read": 464519168, 00:08:25.461 "num_read_ops": 113408, 00:08:25.461 "bytes_written": 0, 00:08:25.461 "num_write_ops": 0, 00:08:25.461 "bytes_unmapped": 0, 00:08:25.461 "num_unmap_ops": 0, 00:08:25.461 "bytes_copied": 0, 00:08:25.461 "num_copy_ops": 0, 00:08:25.461 "read_latency_ticks": 1349837328032, 00:08:25.461 "max_read_latency_ticks": 15857883, 00:08:25.461 "min_read_latency_ticks": 9340156, 00:08:25.461 "write_latency_ticks": 0, 00:08:25.461 "max_write_latency_ticks": 0, 00:08:25.461 "min_write_latency_ticks": 0, 00:08:25.461 "unmap_latency_ticks": 0, 00:08:25.461 "max_unmap_latency_ticks": 0, 00:08:25.461 "min_unmap_latency_ticks": 0, 00:08:25.461 "copy_latency_ticks": 0, 00:08:25.461 "max_copy_latency_ticks": 0, 00:08:25.461 "min_copy_latency_ticks": 0 00:08:25.461 } 00:08:25.461 ] 00:08:25.461 }' 00:08:25.461 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=112384 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=112384 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=113408 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=225792 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:08:25.718 "tick_rate": 2700000000, 00:08:25.718 "ticks": 5254192637206307, 00:08:25.718 "bdevs": [ 00:08:25.718 { 00:08:25.718 "name": "Malloc_STAT", 00:08:25.718 "bytes_read": 971026944, 00:08:25.718 "num_read_ops": 237060, 00:08:25.718 "bytes_written": 0, 00:08:25.718 "num_write_ops": 0, 00:08:25.718 "bytes_unmapped": 0, 00:08:25.718 "num_unmap_ops": 0, 00:08:25.718 "bytes_copied": 0, 00:08:25.718 "num_copy_ops": 0, 00:08:25.718 "read_latency_ticks": 2831432726759, 00:08:25.718 "max_read_latency_ticks": 15857883, 00:08:25.718 "min_read_latency_ticks": 438727, 00:08:25.718 "write_latency_ticks": 0, 00:08:25.718 "max_write_latency_ticks": 0, 00:08:25.718 "min_write_latency_ticks": 0, 00:08:25.718 "unmap_latency_ticks": 0, 00:08:25.718 "max_unmap_latency_ticks": 0, 00:08:25.718 "min_unmap_latency_ticks": 0, 00:08:25.718 "copy_latency_ticks": 0, 00:08:25.718 "max_copy_latency_ticks": 0, 00:08:25.718 
"min_copy_latency_ticks": 0, 00:08:25.718 "io_error": {} 00:08:25.718 } 00:08:25.718 ] 00:08:25.718 }' 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=237060 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 225792 -lt 219396 ']' 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 225792 -gt 237060 ']' 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:25.718 00:08:25.718 Latency(us) 00:08:25.718 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:25.718 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:08:25.718 Malloc_STAT : 2.11 57559.47 224.84 0.00 0.00 4437.01 1171.15 4781.70 00:08:25.718 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:08:25.718 Malloc_STAT : 2.11 57992.99 226.54 0.00 0.00 4403.92 892.02 5873.97 00:08:25.718 =================================================================================================================== 00:08:25.718 Total : 115552.47 451.38 0.00 0.00 4420.39 892.02 5873.97 00:08:25.718 0 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 3822818 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@946 -- # '[' -z 3822818 ']' 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@950 -- # kill -0 3822818 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@951 -- # uname 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3822818 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3822818' 00:08:25.718 killing process with pid 3822818 00:08:25.718 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@965 -- # kill 3822818 00:08:25.718 Received shutdown signal, test time was about 2.172459 seconds 00:08:25.718 00:08:25.718 Latency(us) 00:08:25.718 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:25.718 =================================================================================================================== 00:08:25.719 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:08:25.719 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@970 -- # wait 3822818 00:08:25.976 04:09:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:08:25.976 00:08:25.976 real 0m2.980s 00:08:25.976 user 0m5.778s 00:08:25.976 sys 0m0.338s 00:08:25.976 04:09:13 blockdev_general.bdev_stat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:25.976 04:09:13 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:08:25.976 ************************************ 00:08:25.976 END TEST bdev_stat 00:08:25.976 ************************************ 00:08:25.976 04:09:13 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:08:25.976 04:09:13 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:08:25.976 04:09:13 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:08:25.976 04:09:13 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:08:25.976 04:09:13 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:08:25.976 04:09:13 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:08:25.976 04:09:13 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:08:25.976 04:09:13 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:08:25.976 04:09:13 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:08:25.976 04:09:13 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:08:25.976 00:08:25.976 real 1m54.217s 00:08:25.976 user 7m2.146s 00:08:25.976 sys 0m19.417s 00:08:25.976 04:09:13 blockdev_general -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:25.976 04:09:13 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:25.976 ************************************ 00:08:25.976 END TEST blockdev_general 00:08:25.976 ************************************ 00:08:25.976 04:09:13 -- spdk/autotest.sh@186 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:08:25.976 04:09:13 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:08:25.976 04:09:13 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:25.976 04:09:13 -- common/autotest_common.sh@10 -- # set +x 00:08:26.235 ************************************ 00:08:26.235 START TEST bdev_raid 00:08:26.235 ************************************ 00:08:26.235 04:09:14 bdev_raid -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:08:26.235 * Looking for test storage... 
00:08:26.235 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:08:26.235 04:09:14 bdev_raid -- bdev/bdev_raid.sh@12 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:26.235 04:09:14 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:08:26.235 04:09:14 bdev_raid -- bdev/bdev_raid.sh@14 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:08:26.235 04:09:14 bdev_raid -- bdev/bdev_raid.sh@788 -- # trap 'on_error_exit;' ERR 00:08:26.235 04:09:14 bdev_raid -- bdev/bdev_raid.sh@790 -- # base_blocklen=512 00:08:26.235 04:09:14 bdev_raid -- bdev/bdev_raid.sh@792 -- # uname -s 00:08:26.235 04:09:14 bdev_raid -- bdev/bdev_raid.sh@792 -- # '[' Linux = Linux ']' 00:08:26.235 04:09:14 bdev_raid -- bdev/bdev_raid.sh@792 -- # modprobe -n nbd 00:08:26.235 04:09:14 bdev_raid -- bdev/bdev_raid.sh@793 -- # has_nbd=true 00:08:26.235 04:09:14 bdev_raid -- bdev/bdev_raid.sh@794 -- # modprobe nbd 00:08:26.235 04:09:14 bdev_raid -- bdev/bdev_raid.sh@795 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:08:26.235 04:09:14 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:08:26.235 04:09:14 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:26.235 04:09:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:08:26.235 ************************************ 00:08:26.235 START TEST raid_function_test_raid0 00:08:26.235 ************************************ 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1121 -- # raid_function_test raid0 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local raid_level=raid0 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local nbd=/dev/nbd0 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@83 -- # local raid_bdev 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # raid_pid=3823304 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # echo 'Process raid pid: 3823304' 00:08:26.235 Process raid pid: 3823304 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@88 -- # waitforlisten 3823304 /var/tmp/spdk-raid.sock 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@827 -- # '[' -z 3823304 ']' 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:08:26.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
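Once the raid0 bdev is assembled from Base_1 and Base_2 (via the rpcs.txt heredoc piped through rpc.py below), the function test exports it over NBD and verifies write and unmap behaviour against a reference file. The traced data path, roughly (a sketch; device, socket and file names as in the trace):

RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"

$RPC nbd_start_disk raid /dev/nbd0                        # expose the raid bdev as /dev/nbd0

dd if=/dev/urandom of=/raidrandtest bs=512 count=4096     # 2 MiB of reference data
dd if=/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct
blockdev --flushbufs /dev/nbd0
cmp -b -n 2097152 /raidrandtest /dev/nbd0                 # device contents must match the file

# unmap check: zero a range in the file, discard the same range on the device
dd if=/dev/zero of=/raidrandtest bs=512 seek=0 count=128 conv=notrunc
blkdiscard -o 0 -l 65536 /dev/nbd0
blockdev --flushbufs /dev/nbd0
cmp -b -n 2097152 /raidrandtest /dev/nbd0

$RPC nbd_stop_disk /dev/nbd0                              # teardown, as in nbd_common.sh
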
00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:26.235 04:09:14 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:08:26.235 [2024-05-15 04:09:14.143459] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:08:26.235 [2024-05-15 04:09:14.143544] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:26.235 [2024-05-15 04:09:14.228593] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.493 [2024-05-15 04:09:14.347362] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.493 [2024-05-15 04:09:14.419412] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:26.493 [2024-05-15 04:09:14.419485] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:27.427 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:27.427 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@860 -- # return 0 00:08:27.427 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # configure_raid_bdev raid0 00:08:27.427 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # local raid_level=raid0 00:08:27.427 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@68 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:08:27.427 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@70 -- # cat 00:08:27.427 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:08:27.427 [2024-05-15 04:09:15.373653] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:08:27.427 [2024-05-15 04:09:15.375153] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:08:27.427 [2024-05-15 04:09:15.375217] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x28daaf0 00:08:27.427 [2024-05-15 04:09:15.375234] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:08:27.427 [2024-05-15 04:09:15.375423] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x273dde0 00:08:27.427 [2024-05-15 04:09:15.375571] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28daaf0 00:08:27.427 [2024-05-15 04:09:15.375587] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x28daaf0 00:08:27.427 [2024-05-15 04:09:15.375693] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:27.427 Base_1 00:08:27.427 Base_2 00:08:27.427 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@77 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:08:27.427 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:08:27.427 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # jq -r '.[0]["name"] | select(.)' 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # raid_bdev=raid 00:08:27.685 04:09:15 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@92 -- # '[' raid = '' ']' 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:08:27.685 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:08:27.943 [2024-05-15 04:09:15.862990] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28cfbe0 00:08:27.943 /dev/nbd0 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@865 -- # local i 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # break 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:27.943 1+0 records in 00:08:27.943 1+0 records out 00:08:27.943 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000166105 s, 24.7 MB/s 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # size=4096 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # return 0 00:08:27.943 04:09:15 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:27.943 04:09:15 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:08:28.201 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:28.201 { 00:08:28.201 "nbd_device": "/dev/nbd0", 00:08:28.201 "bdev_name": "raid" 00:08:28.201 } 00:08:28.201 ]' 00:08:28.201 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:28.201 { 00:08:28.201 "nbd_device": "/dev/nbd0", 00:08:28.201 "bdev_name": "raid" 00:08:28.201 } 00:08:28.201 ]' 00:08:28.201 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:28.201 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:08:28.201 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:08:28.201 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:28.201 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:08:28.201 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # count=1 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@99 -- # '[' 1 -ne 1 ']' 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@103 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@17 -- # hash blkdiscard 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # local nbd=/dev/nbd0 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local blksize 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # lsblk -o LOG-SEC /dev/nbd0 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # grep -v LOG-SEC 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # cut -d ' ' -f 5 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # blksize=512 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # local rw_blk_num=4096 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_len=2097152 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # unmap_blk_offs=('0' '1028' '321') 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local unmap_blk_offs 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_nums=('128' '2035' '456') 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_nums 00:08:28.202 
04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_off 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_len 00:08:28.202 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@30 -- # dd if=/dev/urandom of=/raidrandtest bs=512 count=4096 00:08:28.459 4096+0 records in 00:08:28.459 4096+0 records out 00:08:28.459 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0163591 s, 128 MB/s 00:08:28.459 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:08:28.717 4096+0 records in 00:08:28.717 4096+0 records out 00:08:28.717 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.272755 s, 7.7 MB/s 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # blockdev --flushbufs /dev/nbd0 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@35 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i = 0 )) 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # unmap_off=0 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_len=65536 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:08:28.717 128+0 records in 00:08:28.717 128+0 records out 00:08:28.717 65536 bytes (66 kB, 64 KiB) copied, 0.0002824 s, 232 MB/s 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # unmap_off=526336 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_len=1041920 00:08:28.717 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:08:28.717 2035+0 records in 00:08:28.717 2035+0 records out 00:08:28.718 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00371056 s, 281 MB/s 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # unmap_off=164352 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_len=233472 00:08:28.718 04:09:16 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:08:28.718 456+0 records in 00:08:28.718 456+0 records out 00:08:28.718 233472 bytes (233 kB, 228 KiB) copied, 0.000990336 s, 236 MB/s 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@53 -- # return 0 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:28.718 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:28.976 [2024-05-15 04:09:16.786715] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:28.976 04:09:16 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:29.234 04:09:17 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # count=0 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@107 -- # '[' 0 -ne 0 ']' 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@111 -- # killprocess 3823304 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@946 -- # '[' -z 3823304 ']' 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@950 -- # kill -0 3823304 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@951 -- # uname 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3823304 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3823304' 00:08:29.234 killing process with pid 3823304 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@965 -- # kill 3823304 00:08:29.234 [2024-05-15 04:09:17.117359] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:08:29.234 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@970 -- # wait 3823304 00:08:29.234 [2024-05-15 04:09:17.117455] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:08:29.234 [2024-05-15 04:09:17.117515] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:08:29.234 [2024-05-15 04:09:17.117531] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28daaf0 name raid, state offline 00:08:29.234 [2024-05-15 04:09:17.139073] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:08:29.492 04:09:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@113 -- # return 0 00:08:29.492 00:08:29.492 real 0m3.311s 00:08:29.492 user 0m4.550s 00:08:29.492 sys 0m0.941s 00:08:29.492 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:29.492 04:09:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:08:29.492 ************************************ 00:08:29.492 END TEST raid_function_test_raid0 00:08:29.492 ************************************ 00:08:29.492 04:09:17 bdev_raid -- bdev/bdev_raid.sh@796 -- # run_test raid_function_test_concat raid_function_test concat 00:08:29.492 04:09:17 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:08:29.492 04:09:17 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:29.492 
04:09:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:08:29.492 ************************************ 00:08:29.492 START TEST raid_function_test_concat 00:08:29.492 ************************************ 00:08:29.492 04:09:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1121 -- # raid_function_test concat 00:08:29.492 04:09:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local raid_level=concat 00:08:29.492 04:09:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local nbd=/dev/nbd0 00:08:29.492 04:09:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@83 -- # local raid_bdev 00:08:29.492 04:09:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # raid_pid=3823785 00:08:29.492 04:09:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:08:29.492 04:09:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # echo 'Process raid pid: 3823785' 00:08:29.492 Process raid pid: 3823785 00:08:29.492 04:09:17 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@88 -- # waitforlisten 3823785 /var/tmp/spdk-raid.sock 00:08:29.493 04:09:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@827 -- # '[' -z 3823785 ']' 00:08:29.493 04:09:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:08:29.493 04:09:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:29.493 04:09:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:08:29.493 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:08:29.493 04:09:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:29.493 04:09:17 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:08:29.751 [2024-05-15 04:09:17.510991] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
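The concat run launched here repeats the raid_function_test flow that just finished for raid0 above: build a two-bdev raid over the RPC socket, expose it as /dev/nbd0, then verify data integrity across a series of unmaps. A minimal shell sketch of that unmap/verify stage, assuming the nbd device and RPC socket are already set up exactly as in the trace (device name, reference file, offsets and counts are copied from the logged commands, not re-derived):

nbd=/dev/nbd0
blksize=$(lsblk -o LOG-SEC "$nbd" | grep -v LOG-SEC | cut -d ' ' -f 5)   # 512 in this run
dd if=/dev/urandom of=/raidrandtest bs=$blksize count=4096               # 2 MiB of reference data
dd if=/raidrandtest of="$nbd" bs=$blksize count=4096 oflag=direct        # write it through the raid nbd
blockdev --flushbufs "$nbd"
cmp -b -n $((blksize * 4096)) /raidrandtest "$nbd"                       # must match before any unmap
offs=(0 1028 321); nums=(128 2035 456)                                   # block offsets/counts from the trace
for i in 0 1 2; do
    # Zero the same region in the reference file that blkdiscard unmaps on the device,
    # then compare the whole 2 MiB again; any byte mismatch fails the test.
    dd if=/dev/zero of=/raidrandtest bs=$blksize seek=${offs[$i]} count=${nums[$i]} conv=notrunc
    blkdiscard -o $((blksize * offs[i])) -l $((blksize * nums[i])) "$nbd"
    blockdev --flushbufs "$nbd"
    cmp -b -n $((blksize * 4096)) /raidrandtest "$nbd"
done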
00:08:29.751 [2024-05-15 04:09:17.511065] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:29.751 [2024-05-15 04:09:17.587290] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.751 [2024-05-15 04:09:17.696176] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.751 [2024-05-15 04:09:17.765738] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:29.751 [2024-05-15 04:09:17.765780] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:30.684 04:09:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:30.684 04:09:18 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@860 -- # return 0 00:08:30.684 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # configure_raid_bdev concat 00:08:30.684 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # local raid_level=concat 00:08:30.684 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@68 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:08:30.684 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@70 -- # cat 00:08:30.684 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:08:30.941 [2024-05-15 04:09:18.725973] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:08:30.941 [2024-05-15 04:09:18.727401] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:08:30.941 [2024-05-15 04:09:18.727468] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x18ceaf0 00:08:30.941 [2024-05-15 04:09:18.727485] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:08:30.941 [2024-05-15 04:09:18.727675] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1731de0 00:08:30.941 [2024-05-15 04:09:18.727837] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18ceaf0 00:08:30.941 [2024-05-15 04:09:18.727854] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x18ceaf0 00:08:30.941 [2024-05-15 04:09:18.727960] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:30.941 Base_1 00:08:30.941 Base_2 00:08:30.941 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@77 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:08:30.941 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:08:30.941 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # jq -r '.[0]["name"] | select(.)' 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # raid_bdev=raid 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@92 -- # '[' raid = '' ']' 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:08:31.199 04:09:18 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:08:31.457 [2024-05-15 04:09:19.227354] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1713fd0 00:08:31.457 /dev/nbd0 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@865 -- # local i 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # break 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:08:31.457 1+0 records in 00:08:31.457 1+0 records out 00:08:31.457 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189921 s, 21.6 MB/s 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # size=4096 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # return 0 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:08:31.457 
04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:31.457 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:31.715 { 00:08:31.715 "nbd_device": "/dev/nbd0", 00:08:31.715 "bdev_name": "raid" 00:08:31.715 } 00:08:31.715 ]' 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:31.715 { 00:08:31.715 "nbd_device": "/dev/nbd0", 00:08:31.715 "bdev_name": "raid" 00:08:31.715 } 00:08:31.715 ]' 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # count=1 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@99 -- # '[' 1 -ne 1 ']' 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@103 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@17 -- # hash blkdiscard 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # local nbd=/dev/nbd0 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local blksize 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # lsblk -o LOG-SEC /dev/nbd0 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # grep -v LOG-SEC 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # cut -d ' ' -f 5 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # blksize=512 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # local rw_blk_num=4096 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_len=2097152 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # unmap_blk_offs=('0' '1028' '321') 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local unmap_blk_offs 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_nums=('128' '2035' '456') 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_nums 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_off 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_len 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@30 -- # dd 
if=/dev/urandom of=/raidrandtest bs=512 count=4096 00:08:31.715 4096+0 records in 00:08:31.715 4096+0 records out 00:08:31.715 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0165426 s, 127 MB/s 00:08:31.715 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:08:31.972 4096+0 records in 00:08:31.972 4096+0 records out 00:08:31.972 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.231156 s, 9.1 MB/s 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # blockdev --flushbufs /dev/nbd0 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@35 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i = 0 )) 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # unmap_off=0 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_len=65536 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:08:31.972 128+0 records in 00:08:31.972 128+0 records out 00:08:31.972 65536 bytes (66 kB, 64 KiB) copied, 0.000294527 s, 223 MB/s 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # unmap_off=526336 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_len=1041920 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:08:31.972 2035+0 records in 00:08:31.972 2035+0 records out 00:08:31.972 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00366732 s, 284 MB/s 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # unmap_off=164352 00:08:31.972 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_len=233472 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@42 -- # dd if=/dev/zero of=/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:08:31.973 456+0 records in 00:08:31.973 456+0 records out 00:08:31.973 233472 bytes (233 kB, 228 KiB) copied, 0.000607882 
s, 384 MB/s 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@45 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blockdev --flushbufs /dev/nbd0 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@49 -- # cmp -b -n 2097152 /raidrandtest /dev/nbd0 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i++ )) 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@37 -- # (( i < 3 )) 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@53 -- # return 0 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:31.973 04:09:19 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:08:32.230 [2024-05-15 04:09:20.146964] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:08:32.230 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # count=0 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@107 -- # '[' 0 -ne 0 ']' 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@111 -- # killprocess 3823785 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@946 -- # '[' -z 3823785 ']' 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@950 -- # kill -0 3823785 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@951 -- # uname 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3823785 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3823785' 00:08:32.487 killing process with pid 3823785 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@965 -- # kill 3823785 00:08:32.487 [2024-05-15 04:09:20.470293] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:08:32.487 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@970 -- # wait 3823785 00:08:32.487 [2024-05-15 04:09:20.470383] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:08:32.487 [2024-05-15 04:09:20.470439] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:08:32.487 [2024-05-15 04:09:20.470461] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18ceaf0 name raid, state offline 00:08:32.487 [2024-05-15 04:09:20.490708] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:08:32.744 04:09:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@113 -- # return 0 00:08:32.744 00:08:32.744 real 0m3.293s 00:08:32.744 user 0m4.506s 00:08:32.744 sys 0m1.000s 00:08:32.744 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:32.744 04:09:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:08:32.744 ************************************ 00:08:32.744 END TEST raid_function_test_concat 00:08:32.744 ************************************ 00:08:33.001 04:09:20 bdev_raid -- bdev/bdev_raid.sh@799 -- # run_test raid0_resize_test raid0_resize_test 00:08:33.001 04:09:20 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:08:33.001 04:09:20 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:33.001 04:09:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:08:33.001 ************************************ 00:08:33.001 START TEST raid0_resize_test 00:08:33.001 ************************************ 00:08:33.001 04:09:20 
bdev_raid.raid0_resize_test -- common/autotest_common.sh@1121 -- # raid0_resize_test 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local blkcnt 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # raid_pid=3824268 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # echo 'Process raid pid: 3824268' 00:08:33.001 Process raid pid: 3824268 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@358 -- # waitforlisten 3824268 /var/tmp/spdk-raid.sock 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- common/autotest_common.sh@827 -- # '[' -z 3824268 ']' 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:08:33.001 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:33.001 04:09:20 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:08:33.001 [2024-05-15 04:09:20.861673] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
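The resize test starting here drives everything through the same RPC socket. A condensed sketch of the sequence it runs, using only the RPC calls, sizes, and jq filter visible in the trace that follows (the rpc helper function is shorthand for the repo's scripts/rpc.py with -s /var/tmp/spdk-raid.sock):

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
rpc bdev_null_create Base_1 32 512                 # 32 MiB null bdev, 512-byte blocks
rpc bdev_null_create Base_2 32 512
rpc bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid
rpc bdev_null_resize Base_1 64                     # grow only one leg to 64 MiB
rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'   # still 131072 (64 MiB): raid0 only grows once every leg has grown
rpc bdev_null_resize Base_2 64                     # grow the second leg
rpc bdev_get_bdevs -b Raid | jq '.[].num_blocks'   # now 262144 (128 MiB)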
00:08:33.001 [2024-05-15 04:09:20.861755] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:33.001 [2024-05-15 04:09:20.945654] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.258 [2024-05-15 04:09:21.062786] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.258 [2024-05-15 04:09:21.136450] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:33.258 [2024-05-15 04:09:21.136499] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:33.928 04:09:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:33.928 04:09:21 bdev_raid.raid0_resize_test -- common/autotest_common.sh@860 -- # return 0 00:08:33.928 04:09:21 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:08:34.184 Base_1 00:08:34.184 04:09:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@361 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:08:34.440 Base_2 00:08:34.440 04:09:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@363 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:08:34.697 [2024-05-15 04:09:22.476988] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:08:34.697 [2024-05-15 04:09:22.478450] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:08:34.697 [2024-05-15 04:09:22.478507] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x974700 00:08:34.697 [2024-05-15 04:09:22.478523] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:08:34.697 [2024-05-15 04:09:22.478722] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x976180 00:08:34.697 [2024-05-15 04:09:22.478847] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x974700 00:08:34.697 [2024-05-15 04:09:22.478863] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x974700 00:08:34.697 [2024-05-15 04:09:22.478968] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:34.697 04:09:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@366 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:08:34.955 [2024-05-15 04:09:22.729615] bdev_raid.c:2232:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:08:34.955 [2024-05-15 04:09:22.729637] bdev_raid.c:2245:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:08:34.955 true 00:08:34.955 04:09:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:08:34.955 04:09:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # jq '.[].num_blocks' 00:08:34.955 [2024-05-15 04:09:22.970413] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:08:35.212 04:09:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # 
blkcnt=131072 00:08:35.212 04:09:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # raid_size_mb=64 00:08:35.212 04:09:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@371 -- # '[' 64 '!=' 64 ']' 00:08:35.212 04:09:22 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:08:35.212 [2024-05-15 04:09:23.206883] bdev_raid.c:2232:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:08:35.212 [2024-05-15 04:09:23.206909] bdev_raid.c:2245:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:08:35.213 [2024-05-15 04:09:23.206941] bdev_raid.c:2259:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:08:35.213 true 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # jq '.[].num_blocks' 00:08:35.470 [2024-05-15 04:09:23.455679] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # blkcnt=262144 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # raid_size_mb=128 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@382 -- # '[' 128 '!=' 128 ']' 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@387 -- # killprocess 3824268 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@946 -- # '[' -z 3824268 ']' 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@950 -- # kill -0 3824268 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@951 -- # uname 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:35.470 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3824268 00:08:35.728 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:35.728 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:35.728 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3824268' 00:08:35.728 killing process with pid 3824268 00:08:35.728 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@965 -- # kill 3824268 00:08:35.728 [2024-05-15 04:09:23.499240] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:08:35.728 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@970 -- # wait 3824268 00:08:35.728 [2024-05-15 04:09:23.499328] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:08:35.728 [2024-05-15 04:09:23.499389] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:08:35.728 [2024-05-15 04:09:23.499405] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x974700 name Raid, state offline 00:08:35.728 [2024-05-15 04:09:23.500918] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:08:35.985 04:09:23 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@389 -- # return 0 00:08:35.985 00:08:35.985 real 
0m2.945s 00:08:35.985 user 0m4.571s 00:08:35.985 sys 0m0.538s 00:08:35.985 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:35.985 04:09:23 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:08:35.985 ************************************ 00:08:35.985 END TEST raid0_resize_test 00:08:35.985 ************************************ 00:08:35.985 04:09:23 bdev_raid -- bdev/bdev_raid.sh@801 -- # for n in {2..4} 00:08:35.985 04:09:23 bdev_raid -- bdev/bdev_raid.sh@802 -- # for level in raid0 concat raid1 00:08:35.985 04:09:23 bdev_raid -- bdev/bdev_raid.sh@803 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:08:35.985 04:09:23 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:08:35.985 04:09:23 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:35.985 04:09:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:08:35.985 ************************************ 00:08:35.985 START TEST raid_state_function_test 00:08:35.985 ************************************ 00:08:35.985 04:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 2 false 00:08:35.985 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:08:35.985 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:08:35.985 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:08:35.986 04:09:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=3824644 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3824644' 00:08:35.986 Process raid pid: 3824644 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 3824644 /var/tmp/spdk-raid.sock 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 3824644 ']' 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:08:35.986 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:35.986 04:09:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:08:35.986 [2024-05-15 04:09:23.859786] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
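The state test launched here inspects the raid bdev's reported state after each configuration step instead of doing I/O. A hedged, condensed sketch of the check it repeats, built from the RPC calls that appear in the trace below (the jq filter extends the logged one with a field selector; the delete/recreate steps the test also performs are omitted here):

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
# Registering the raid before its base bdevs exist leaves it in the "configuring" state.
rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
# -> "configuring", with num_base_bdevs_discovered = 0
rpc bdev_malloc_create 32 512 -b BaseBdev1         # first leg appears; raid is still configuring
rpc bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid") | .num_base_bdevs_discovered'
# -> 1; the test expects the state to stay "configuring" until the second leg is added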
00:08:35.986 [2024-05-15 04:09:23.859878] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:35.986 [2024-05-15 04:09:23.938215] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.243 [2024-05-15 04:09:24.047298] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.243 [2024-05-15 04:09:24.120518] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:36.243 [2024-05-15 04:09:24.120564] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:36.807 04:09:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:36.807 04:09:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:08:36.807 04:09:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:08:37.064 [2024-05-15 04:09:25.040365] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:08:37.064 [2024-05-15 04:09:25.040405] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:08:37.064 [2024-05-15 04:09:25.040415] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:08:37.064 [2024-05-15 04:09:25.040425] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:37.064 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:08:37.322 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:37.322 "name": "Existed_Raid", 00:08:37.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:37.322 "strip_size_kb": 64, 00:08:37.322 "state": "configuring", 00:08:37.322 "raid_level": "raid0", 00:08:37.322 "superblock": false, 00:08:37.322 "num_base_bdevs": 2, 
00:08:37.322 "num_base_bdevs_discovered": 0, 00:08:37.322 "num_base_bdevs_operational": 2, 00:08:37.322 "base_bdevs_list": [ 00:08:37.322 { 00:08:37.322 "name": "BaseBdev1", 00:08:37.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:37.322 "is_configured": false, 00:08:37.322 "data_offset": 0, 00:08:37.322 "data_size": 0 00:08:37.322 }, 00:08:37.322 { 00:08:37.322 "name": "BaseBdev2", 00:08:37.322 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:37.322 "is_configured": false, 00:08:37.322 "data_offset": 0, 00:08:37.322 "data_size": 0 00:08:37.322 } 00:08:37.322 ] 00:08:37.322 }' 00:08:37.322 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:08:37.322 04:09:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:08:37.888 04:09:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:08:38.146 [2024-05-15 04:09:26.071002] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:08:38.146 [2024-05-15 04:09:26.071032] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b6e000 name Existed_Raid, state configuring 00:08:38.146 04:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:08:38.403 [2024-05-15 04:09:26.315651] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:08:38.403 [2024-05-15 04:09:26.315699] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:08:38.403 [2024-05-15 04:09:26.315710] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:08:38.403 [2024-05-15 04:09:26.315723] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:08:38.403 04:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:08:38.661 [2024-05-15 04:09:26.564740] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:08:38.661 BaseBdev1 00:08:38.661 04:09:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:08:38.661 04:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:08:38.661 04:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:08:38.661 04:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:08:38.661 04:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:08:38.661 04:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:08:38.661 04:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:08:38.919 04:09:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:08:39.176 [ 00:08:39.176 { 
00:08:39.176 "name": "BaseBdev1", 00:08:39.176 "aliases": [ 00:08:39.176 "f8348cc5-ff0e-41f1-9ffe-375f19e62f1c" 00:08:39.176 ], 00:08:39.176 "product_name": "Malloc disk", 00:08:39.176 "block_size": 512, 00:08:39.176 "num_blocks": 65536, 00:08:39.176 "uuid": "f8348cc5-ff0e-41f1-9ffe-375f19e62f1c", 00:08:39.176 "assigned_rate_limits": { 00:08:39.176 "rw_ios_per_sec": 0, 00:08:39.176 "rw_mbytes_per_sec": 0, 00:08:39.176 "r_mbytes_per_sec": 0, 00:08:39.176 "w_mbytes_per_sec": 0 00:08:39.176 }, 00:08:39.176 "claimed": true, 00:08:39.176 "claim_type": "exclusive_write", 00:08:39.176 "zoned": false, 00:08:39.176 "supported_io_types": { 00:08:39.176 "read": true, 00:08:39.176 "write": true, 00:08:39.176 "unmap": true, 00:08:39.176 "write_zeroes": true, 00:08:39.176 "flush": true, 00:08:39.176 "reset": true, 00:08:39.176 "compare": false, 00:08:39.176 "compare_and_write": false, 00:08:39.176 "abort": true, 00:08:39.176 "nvme_admin": false, 00:08:39.176 "nvme_io": false 00:08:39.176 }, 00:08:39.176 "memory_domains": [ 00:08:39.176 { 00:08:39.176 "dma_device_id": "system", 00:08:39.176 "dma_device_type": 1 00:08:39.176 }, 00:08:39.176 { 00:08:39.176 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:39.176 "dma_device_type": 2 00:08:39.176 } 00:08:39.176 ], 00:08:39.176 "driver_specific": {} 00:08:39.176 } 00:08:39.176 ] 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:39.176 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:39.177 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:39.177 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:08:39.433 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:39.433 "name": "Existed_Raid", 00:08:39.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:39.433 "strip_size_kb": 64, 00:08:39.433 "state": "configuring", 00:08:39.433 "raid_level": "raid0", 00:08:39.433 "superblock": false, 00:08:39.433 "num_base_bdevs": 2, 00:08:39.433 "num_base_bdevs_discovered": 1, 00:08:39.433 "num_base_bdevs_operational": 2, 00:08:39.433 "base_bdevs_list": [ 00:08:39.433 { 00:08:39.433 "name": "BaseBdev1", 00:08:39.433 "uuid": "f8348cc5-ff0e-41f1-9ffe-375f19e62f1c", 00:08:39.433 
"is_configured": true, 00:08:39.433 "data_offset": 0, 00:08:39.433 "data_size": 65536 00:08:39.433 }, 00:08:39.433 { 00:08:39.433 "name": "BaseBdev2", 00:08:39.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:39.433 "is_configured": false, 00:08:39.433 "data_offset": 0, 00:08:39.433 "data_size": 0 00:08:39.433 } 00:08:39.433 ] 00:08:39.433 }' 00:08:39.433 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:08:39.433 04:09:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:08:39.999 04:09:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:08:40.257 [2024-05-15 04:09:28.096766] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:08:40.257 [2024-05-15 04:09:28.096811] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b6d8f0 name Existed_Raid, state configuring 00:08:40.257 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:08:40.515 [2024-05-15 04:09:28.341455] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:08:40.515 [2024-05-15 04:09:28.343025] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:08:40.515 [2024-05-15 04:09:28.343067] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:40.515 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:08:40.775 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:40.775 "name": "Existed_Raid", 00:08:40.775 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:08:40.775 "strip_size_kb": 64, 00:08:40.775 "state": "configuring", 00:08:40.775 "raid_level": "raid0", 00:08:40.775 "superblock": false, 00:08:40.775 "num_base_bdevs": 2, 00:08:40.775 "num_base_bdevs_discovered": 1, 00:08:40.775 "num_base_bdevs_operational": 2, 00:08:40.775 "base_bdevs_list": [ 00:08:40.775 { 00:08:40.775 "name": "BaseBdev1", 00:08:40.775 "uuid": "f8348cc5-ff0e-41f1-9ffe-375f19e62f1c", 00:08:40.775 "is_configured": true, 00:08:40.775 "data_offset": 0, 00:08:40.775 "data_size": 65536 00:08:40.775 }, 00:08:40.775 { 00:08:40.775 "name": "BaseBdev2", 00:08:40.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:40.775 "is_configured": false, 00:08:40.775 "data_offset": 0, 00:08:40.775 "data_size": 0 00:08:40.775 } 00:08:40.775 ] 00:08:40.775 }' 00:08:40.775 04:09:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:08:40.775 04:09:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:08:41.388 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:08:41.646 [2024-05-15 04:09:29.422138] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:08:41.646 [2024-05-15 04:09:29.422179] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b6e6e0 00:08:41.646 [2024-05-15 04:09:29.422190] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:08:41.646 [2024-05-15 04:09:29.422393] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b64e60 00:08:41.646 [2024-05-15 04:09:29.422549] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b6e6e0 00:08:41.646 [2024-05-15 04:09:29.422565] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b6e6e0 00:08:41.646 [2024-05-15 04:09:29.422785] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:41.646 BaseBdev2 00:08:41.646 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:08:41.646 04:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:08:41.646 04:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:08:41.646 04:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:08:41.646 04:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:08:41.646 04:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:08:41.646 04:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:08:41.904 04:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:08:41.904 [ 00:08:41.904 { 00:08:41.904 "name": "BaseBdev2", 00:08:41.904 "aliases": [ 00:08:41.904 "d3fc0ec5-4b32-4e80-9fd1-be27709fa9e9" 00:08:41.904 ], 00:08:41.904 "product_name": "Malloc disk", 00:08:41.904 "block_size": 512, 00:08:41.904 "num_blocks": 65536, 00:08:41.904 "uuid": 
"d3fc0ec5-4b32-4e80-9fd1-be27709fa9e9", 00:08:41.904 "assigned_rate_limits": { 00:08:41.904 "rw_ios_per_sec": 0, 00:08:41.904 "rw_mbytes_per_sec": 0, 00:08:41.904 "r_mbytes_per_sec": 0, 00:08:41.904 "w_mbytes_per_sec": 0 00:08:41.904 }, 00:08:41.904 "claimed": true, 00:08:41.904 "claim_type": "exclusive_write", 00:08:41.904 "zoned": false, 00:08:41.904 "supported_io_types": { 00:08:41.904 "read": true, 00:08:41.904 "write": true, 00:08:41.904 "unmap": true, 00:08:41.904 "write_zeroes": true, 00:08:41.904 "flush": true, 00:08:41.904 "reset": true, 00:08:41.904 "compare": false, 00:08:41.904 "compare_and_write": false, 00:08:41.904 "abort": true, 00:08:41.904 "nvme_admin": false, 00:08:41.904 "nvme_io": false 00:08:41.904 }, 00:08:41.904 "memory_domains": [ 00:08:41.904 { 00:08:41.904 "dma_device_id": "system", 00:08:41.904 "dma_device_type": 1 00:08:41.904 }, 00:08:41.904 { 00:08:41.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:41.904 "dma_device_type": 2 00:08:41.904 } 00:08:41.904 ], 00:08:41.904 "driver_specific": {} 00:08:41.904 } 00:08:41.904 ] 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:42.162 04:09:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:08:42.420 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:42.420 "name": "Existed_Raid", 00:08:42.420 "uuid": "27a13834-efbe-4a92-9d6f-b03a0c2dc626", 00:08:42.420 "strip_size_kb": 64, 00:08:42.420 "state": "online", 00:08:42.420 "raid_level": "raid0", 00:08:42.420 "superblock": false, 00:08:42.420 "num_base_bdevs": 2, 00:08:42.420 "num_base_bdevs_discovered": 2, 00:08:42.420 "num_base_bdevs_operational": 2, 00:08:42.420 "base_bdevs_list": [ 00:08:42.420 { 00:08:42.420 "name": "BaseBdev1", 00:08:42.420 "uuid": "f8348cc5-ff0e-41f1-9ffe-375f19e62f1c", 00:08:42.420 "is_configured": true, 00:08:42.420 "data_offset": 0, 00:08:42.420 
"data_size": 65536 00:08:42.420 }, 00:08:42.420 { 00:08:42.420 "name": "BaseBdev2", 00:08:42.420 "uuid": "d3fc0ec5-4b32-4e80-9fd1-be27709fa9e9", 00:08:42.420 "is_configured": true, 00:08:42.421 "data_offset": 0, 00:08:42.421 "data_size": 65536 00:08:42.421 } 00:08:42.421 ] 00:08:42.421 }' 00:08:42.421 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:08:42.421 04:09:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:08:42.985 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:08:42.985 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:08:42.985 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:08:42.985 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:08:42.985 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:08:42.985 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:08:42.985 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:08:42.985 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:08:42.985 [2024-05-15 04:09:30.946424] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:08:42.985 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:08:42.985 "name": "Existed_Raid", 00:08:42.985 "aliases": [ 00:08:42.985 "27a13834-efbe-4a92-9d6f-b03a0c2dc626" 00:08:42.985 ], 00:08:42.985 "product_name": "Raid Volume", 00:08:42.985 "block_size": 512, 00:08:42.985 "num_blocks": 131072, 00:08:42.985 "uuid": "27a13834-efbe-4a92-9d6f-b03a0c2dc626", 00:08:42.985 "assigned_rate_limits": { 00:08:42.985 "rw_ios_per_sec": 0, 00:08:42.985 "rw_mbytes_per_sec": 0, 00:08:42.985 "r_mbytes_per_sec": 0, 00:08:42.985 "w_mbytes_per_sec": 0 00:08:42.985 }, 00:08:42.985 "claimed": false, 00:08:42.985 "zoned": false, 00:08:42.985 "supported_io_types": { 00:08:42.985 "read": true, 00:08:42.985 "write": true, 00:08:42.985 "unmap": true, 00:08:42.985 "write_zeroes": true, 00:08:42.985 "flush": true, 00:08:42.985 "reset": true, 00:08:42.985 "compare": false, 00:08:42.985 "compare_and_write": false, 00:08:42.985 "abort": false, 00:08:42.985 "nvme_admin": false, 00:08:42.985 "nvme_io": false 00:08:42.985 }, 00:08:42.985 "memory_domains": [ 00:08:42.985 { 00:08:42.985 "dma_device_id": "system", 00:08:42.985 "dma_device_type": 1 00:08:42.985 }, 00:08:42.985 { 00:08:42.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:42.985 "dma_device_type": 2 00:08:42.985 }, 00:08:42.985 { 00:08:42.985 "dma_device_id": "system", 00:08:42.985 "dma_device_type": 1 00:08:42.985 }, 00:08:42.985 { 00:08:42.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:42.985 "dma_device_type": 2 00:08:42.985 } 00:08:42.985 ], 00:08:42.985 "driver_specific": { 00:08:42.985 "raid": { 00:08:42.985 "uuid": "27a13834-efbe-4a92-9d6f-b03a0c2dc626", 00:08:42.985 "strip_size_kb": 64, 00:08:42.985 "state": "online", 00:08:42.985 "raid_level": "raid0", 00:08:42.985 "superblock": false, 00:08:42.985 "num_base_bdevs": 2, 00:08:42.985 "num_base_bdevs_discovered": 2, 00:08:42.985 "num_base_bdevs_operational": 2, 00:08:42.985 
"base_bdevs_list": [ 00:08:42.985 { 00:08:42.985 "name": "BaseBdev1", 00:08:42.985 "uuid": "f8348cc5-ff0e-41f1-9ffe-375f19e62f1c", 00:08:42.985 "is_configured": true, 00:08:42.985 "data_offset": 0, 00:08:42.985 "data_size": 65536 00:08:42.985 }, 00:08:42.985 { 00:08:42.985 "name": "BaseBdev2", 00:08:42.985 "uuid": "d3fc0ec5-4b32-4e80-9fd1-be27709fa9e9", 00:08:42.985 "is_configured": true, 00:08:42.985 "data_offset": 0, 00:08:42.985 "data_size": 65536 00:08:42.985 } 00:08:42.985 ] 00:08:42.985 } 00:08:42.985 } 00:08:42.985 }' 00:08:42.985 04:09:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:08:43.243 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:08:43.243 BaseBdev2' 00:08:43.243 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:08:43.243 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:08:43.243 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:08:43.243 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:08:43.243 "name": "BaseBdev1", 00:08:43.243 "aliases": [ 00:08:43.243 "f8348cc5-ff0e-41f1-9ffe-375f19e62f1c" 00:08:43.243 ], 00:08:43.243 "product_name": "Malloc disk", 00:08:43.243 "block_size": 512, 00:08:43.243 "num_blocks": 65536, 00:08:43.243 "uuid": "f8348cc5-ff0e-41f1-9ffe-375f19e62f1c", 00:08:43.243 "assigned_rate_limits": { 00:08:43.243 "rw_ios_per_sec": 0, 00:08:43.243 "rw_mbytes_per_sec": 0, 00:08:43.243 "r_mbytes_per_sec": 0, 00:08:43.243 "w_mbytes_per_sec": 0 00:08:43.243 }, 00:08:43.243 "claimed": true, 00:08:43.243 "claim_type": "exclusive_write", 00:08:43.243 "zoned": false, 00:08:43.243 "supported_io_types": { 00:08:43.243 "read": true, 00:08:43.243 "write": true, 00:08:43.243 "unmap": true, 00:08:43.243 "write_zeroes": true, 00:08:43.243 "flush": true, 00:08:43.243 "reset": true, 00:08:43.243 "compare": false, 00:08:43.243 "compare_and_write": false, 00:08:43.243 "abort": true, 00:08:43.243 "nvme_admin": false, 00:08:43.243 "nvme_io": false 00:08:43.243 }, 00:08:43.243 "memory_domains": [ 00:08:43.243 { 00:08:43.243 "dma_device_id": "system", 00:08:43.243 "dma_device_type": 1 00:08:43.243 }, 00:08:43.243 { 00:08:43.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:43.243 "dma_device_type": 2 00:08:43.243 } 00:08:43.243 ], 00:08:43.243 "driver_specific": {} 00:08:43.243 }' 00:08:43.243 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:08:43.501 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:08:43.501 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:08:43.501 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:08:43.501 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:08:43.501 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:08:43.501 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:08:43.501 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:08:43.501 04:09:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:08:43.501 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:08:43.501 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:08:43.758 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:08:43.758 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:08:43.758 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:08:43.758 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:08:44.015 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:08:44.015 "name": "BaseBdev2", 00:08:44.015 "aliases": [ 00:08:44.015 "d3fc0ec5-4b32-4e80-9fd1-be27709fa9e9" 00:08:44.015 ], 00:08:44.015 "product_name": "Malloc disk", 00:08:44.015 "block_size": 512, 00:08:44.015 "num_blocks": 65536, 00:08:44.015 "uuid": "d3fc0ec5-4b32-4e80-9fd1-be27709fa9e9", 00:08:44.015 "assigned_rate_limits": { 00:08:44.015 "rw_ios_per_sec": 0, 00:08:44.015 "rw_mbytes_per_sec": 0, 00:08:44.015 "r_mbytes_per_sec": 0, 00:08:44.015 "w_mbytes_per_sec": 0 00:08:44.015 }, 00:08:44.015 "claimed": true, 00:08:44.015 "claim_type": "exclusive_write", 00:08:44.015 "zoned": false, 00:08:44.015 "supported_io_types": { 00:08:44.015 "read": true, 00:08:44.015 "write": true, 00:08:44.015 "unmap": true, 00:08:44.015 "write_zeroes": true, 00:08:44.015 "flush": true, 00:08:44.015 "reset": true, 00:08:44.015 "compare": false, 00:08:44.015 "compare_and_write": false, 00:08:44.015 "abort": true, 00:08:44.015 "nvme_admin": false, 00:08:44.015 "nvme_io": false 00:08:44.015 }, 00:08:44.015 "memory_domains": [ 00:08:44.015 { 00:08:44.015 "dma_device_id": "system", 00:08:44.015 "dma_device_type": 1 00:08:44.015 }, 00:08:44.015 { 00:08:44.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:44.015 "dma_device_type": 2 00:08:44.015 } 00:08:44.015 ], 00:08:44.015 "driver_specific": {} 00:08:44.015 }' 00:08:44.015 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:08:44.015 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:08:44.015 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:08:44.015 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:08:44.015 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:08:44.016 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:08:44.016 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:08:44.016 04:09:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:08:44.016 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:08:44.016 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:08:44.272 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:08:44.272 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:08:44.272 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:08:44.530 [2024-05-15 04:09:32.321907] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:08:44.530 [2024-05-15 04:09:32.321939] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:08:44.530 [2024-05-15 04:09:32.321987] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:44.530 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:08:44.788 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:44.788 "name": "Existed_Raid", 00:08:44.788 "uuid": "27a13834-efbe-4a92-9d6f-b03a0c2dc626", 00:08:44.788 "strip_size_kb": 64, 00:08:44.788 "state": "offline", 00:08:44.788 "raid_level": "raid0", 00:08:44.788 "superblock": false, 00:08:44.788 "num_base_bdevs": 2, 00:08:44.788 "num_base_bdevs_discovered": 1, 00:08:44.788 "num_base_bdevs_operational": 1, 00:08:44.788 "base_bdevs_list": [ 00:08:44.788 { 00:08:44.788 "name": null, 00:08:44.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:44.788 "is_configured": false, 00:08:44.788 "data_offset": 0, 00:08:44.788 "data_size": 65536 00:08:44.788 }, 00:08:44.788 { 00:08:44.788 "name": "BaseBdev2", 00:08:44.788 "uuid": "d3fc0ec5-4b32-4e80-9fd1-be27709fa9e9", 00:08:44.788 "is_configured": true, 00:08:44.788 "data_offset": 0, 00:08:44.788 "data_size": 65536 00:08:44.788 } 00:08:44.788 ] 00:08:44.788 }' 00:08:44.788 04:09:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:08:44.788 04:09:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:08:45.354 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:08:45.354 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:08:45.354 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:45.354 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:08:45.612 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:08:45.612 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:08:45.612 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:08:45.612 [2024-05-15 04:09:33.603411] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:08:45.612 [2024-05-15 04:09:33.603470] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b6e6e0 name Existed_Raid, state offline 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 3824644 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 3824644 ']' 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 3824644 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:45.870 04:09:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3824644 00:08:46.128 04:09:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:46.128 04:09:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:46.128 04:09:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3824644' 00:08:46.128 killing process with pid 3824644 00:08:46.128 04:09:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 3824644 00:08:46.128 [2024-05-15 04:09:33.900129] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:08:46.128 04:09:33 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@970 -- # wait 3824644 00:08:46.128 [2024-05-15 04:09:33.901209] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:08:46.387 00:08:46.387 real 0m10.361s 00:08:46.387 user 0m18.710s 00:08:46.387 sys 0m1.491s 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:08:46.387 ************************************ 00:08:46.387 END TEST raid_state_function_test 00:08:46.387 ************************************ 00:08:46.387 04:09:34 bdev_raid -- bdev/bdev_raid.sh@804 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:08:46.387 04:09:34 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:08:46.387 04:09:34 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:46.387 04:09:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:08:46.387 ************************************ 00:08:46.387 START TEST raid_state_function_test_sb 00:08:46.387 ************************************ 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 2 true 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@232 -- # strip_size=64 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=3826134 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3826134' 00:08:46.387 Process raid pid: 3826134 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 3826134 /var/tmp/spdk-raid.sock 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3826134 ']' 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:08:46.387 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:46.387 04:09:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:08:46.387 [2024-05-15 04:09:34.270358] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:08:46.387 [2024-05-15 04:09:34.270425] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:46.387 [2024-05-15 04:09:34.352975] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.645 [2024-05-15 04:09:34.470432] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.645 [2024-05-15 04:09:34.534991] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:46.645 [2024-05-15 04:09:34.535034] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:47.209 04:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:47.210 04:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:08:47.210 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:08:47.468 [2024-05-15 04:09:35.426387] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:08:47.468 [2024-05-15 04:09:35.426428] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:08:47.468 [2024-05-15 04:09:35.426459] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:08:47.468 [2024-05-15 04:09:35.426474] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:47.468 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:08:47.732 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:47.732 "name": "Existed_Raid", 00:08:47.732 "uuid": "1ed733fd-80a1-4d85-9d21-647ca89e261d", 00:08:47.732 "strip_size_kb": 64, 00:08:47.732 "state": "configuring", 00:08:47.732 "raid_level": "raid0", 00:08:47.732 
"superblock": true, 00:08:47.732 "num_base_bdevs": 2, 00:08:47.732 "num_base_bdevs_discovered": 0, 00:08:47.732 "num_base_bdevs_operational": 2, 00:08:47.732 "base_bdevs_list": [ 00:08:47.732 { 00:08:47.732 "name": "BaseBdev1", 00:08:47.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:47.732 "is_configured": false, 00:08:47.732 "data_offset": 0, 00:08:47.732 "data_size": 0 00:08:47.732 }, 00:08:47.732 { 00:08:47.732 "name": "BaseBdev2", 00:08:47.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:47.732 "is_configured": false, 00:08:47.732 "data_offset": 0, 00:08:47.732 "data_size": 0 00:08:47.732 } 00:08:47.732 ] 00:08:47.732 }' 00:08:47.732 04:09:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:08:47.732 04:09:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:08:48.297 04:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:08:48.555 [2024-05-15 04:09:36.457004] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:08:48.555 [2024-05-15 04:09:36.457034] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1457000 name Existed_Raid, state configuring 00:08:48.555 04:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:08:48.813 [2024-05-15 04:09:36.693647] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:08:48.813 [2024-05-15 04:09:36.693686] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:08:48.813 [2024-05-15 04:09:36.693698] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:08:48.813 [2024-05-15 04:09:36.693711] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:08:48.813 04:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:08:49.071 [2024-05-15 04:09:36.950085] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:08:49.071 BaseBdev1 00:08:49.071 04:09:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:08:49.071 04:09:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:08:49.071 04:09:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:08:49.071 04:09:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:08:49.071 04:09:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:08:49.071 04:09:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:08:49.071 04:09:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:08:49.329 04:09:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:08:49.587 [ 00:08:49.587 { 00:08:49.587 "name": "BaseBdev1", 00:08:49.587 "aliases": [ 00:08:49.587 "1f4ab9c1-378d-45cb-89ee-8df12ad3d2d8" 00:08:49.587 ], 00:08:49.587 "product_name": "Malloc disk", 00:08:49.587 "block_size": 512, 00:08:49.587 "num_blocks": 65536, 00:08:49.587 "uuid": "1f4ab9c1-378d-45cb-89ee-8df12ad3d2d8", 00:08:49.587 "assigned_rate_limits": { 00:08:49.587 "rw_ios_per_sec": 0, 00:08:49.587 "rw_mbytes_per_sec": 0, 00:08:49.587 "r_mbytes_per_sec": 0, 00:08:49.587 "w_mbytes_per_sec": 0 00:08:49.587 }, 00:08:49.587 "claimed": true, 00:08:49.587 "claim_type": "exclusive_write", 00:08:49.587 "zoned": false, 00:08:49.587 "supported_io_types": { 00:08:49.587 "read": true, 00:08:49.587 "write": true, 00:08:49.587 "unmap": true, 00:08:49.587 "write_zeroes": true, 00:08:49.587 "flush": true, 00:08:49.587 "reset": true, 00:08:49.587 "compare": false, 00:08:49.587 "compare_and_write": false, 00:08:49.587 "abort": true, 00:08:49.587 "nvme_admin": false, 00:08:49.587 "nvme_io": false 00:08:49.587 }, 00:08:49.587 "memory_domains": [ 00:08:49.587 { 00:08:49.587 "dma_device_id": "system", 00:08:49.587 "dma_device_type": 1 00:08:49.587 }, 00:08:49.587 { 00:08:49.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:49.587 "dma_device_type": 2 00:08:49.587 } 00:08:49.587 ], 00:08:49.587 "driver_specific": {} 00:08:49.587 } 00:08:49.587 ] 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:49.587 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:08:49.846 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:49.846 "name": "Existed_Raid", 00:08:49.846 "uuid": "36a81984-bc93-457b-b31a-6e57ff84dd00", 00:08:49.846 "strip_size_kb": 64, 00:08:49.846 "state": "configuring", 00:08:49.846 "raid_level": "raid0", 00:08:49.846 "superblock": true, 00:08:49.846 "num_base_bdevs": 2, 00:08:49.846 "num_base_bdevs_discovered": 1, 00:08:49.846 "num_base_bdevs_operational": 2, 00:08:49.846 
"base_bdevs_list": [ 00:08:49.846 { 00:08:49.846 "name": "BaseBdev1", 00:08:49.846 "uuid": "1f4ab9c1-378d-45cb-89ee-8df12ad3d2d8", 00:08:49.846 "is_configured": true, 00:08:49.846 "data_offset": 2048, 00:08:49.846 "data_size": 63488 00:08:49.846 }, 00:08:49.846 { 00:08:49.846 "name": "BaseBdev2", 00:08:49.846 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:49.846 "is_configured": false, 00:08:49.846 "data_offset": 0, 00:08:49.846 "data_size": 0 00:08:49.846 } 00:08:49.846 ] 00:08:49.846 }' 00:08:49.846 04:09:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:08:49.846 04:09:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:08:50.411 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:08:50.669 [2024-05-15 04:09:38.498203] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:08:50.669 [2024-05-15 04:09:38.498251] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14568f0 name Existed_Raid, state configuring 00:08:50.669 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:08:50.927 [2024-05-15 04:09:38.738906] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:08:50.927 [2024-05-15 04:09:38.740318] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:08:50.927 [2024-05-15 04:09:38.740348] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:08:50.927 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:08:50.927 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:08:50.927 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:08:50.927 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:08:50.927 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:08:50.927 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:50.928 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:50.928 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:08:50.928 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:50.928 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:50.928 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:50.928 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:50.928 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:50.928 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:08:51.185 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:51.185 "name": "Existed_Raid", 00:08:51.185 "uuid": "10ea50e6-7fe7-4b20-8f21-903aa43bcbbf", 00:08:51.185 "strip_size_kb": 64, 00:08:51.185 "state": "configuring", 00:08:51.185 "raid_level": "raid0", 00:08:51.185 "superblock": true, 00:08:51.185 "num_base_bdevs": 2, 00:08:51.185 "num_base_bdevs_discovered": 1, 00:08:51.185 "num_base_bdevs_operational": 2, 00:08:51.185 "base_bdevs_list": [ 00:08:51.185 { 00:08:51.185 "name": "BaseBdev1", 00:08:51.185 "uuid": "1f4ab9c1-378d-45cb-89ee-8df12ad3d2d8", 00:08:51.185 "is_configured": true, 00:08:51.185 "data_offset": 2048, 00:08:51.185 "data_size": 63488 00:08:51.185 }, 00:08:51.185 { 00:08:51.185 "name": "BaseBdev2", 00:08:51.185 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:51.185 "is_configured": false, 00:08:51.185 "data_offset": 0, 00:08:51.186 "data_size": 0 00:08:51.186 } 00:08:51.186 ] 00:08:51.186 }' 00:08:51.186 04:09:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:08:51.186 04:09:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:08:51.750 04:09:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:08:52.077 [2024-05-15 04:09:39.779932] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:08:52.078 [2024-05-15 04:09:39.780173] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x14576e0 00:08:52.078 [2024-05-15 04:09:39.780204] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:08:52.078 [2024-05-15 04:09:39.780346] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1458a30 00:08:52.078 [2024-05-15 04:09:39.780469] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14576e0 00:08:52.078 [2024-05-15 04:09:39.780483] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x14576e0 00:08:52.078 [2024-05-15 04:09:39.780573] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:52.078 BaseBdev2 00:08:52.078 04:09:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:08:52.078 04:09:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:08:52.078 04:09:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:08:52.078 04:09:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:08:52.078 04:09:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:08:52.078 04:09:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:08:52.078 04:09:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:08:52.078 04:09:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:08:52.336 [ 00:08:52.336 { 00:08:52.336 "name": "BaseBdev2", 00:08:52.336 
"aliases": [ 00:08:52.336 "dbc6e2dd-108f-454f-8f35-41b056cd6ef4" 00:08:52.336 ], 00:08:52.336 "product_name": "Malloc disk", 00:08:52.336 "block_size": 512, 00:08:52.336 "num_blocks": 65536, 00:08:52.336 "uuid": "dbc6e2dd-108f-454f-8f35-41b056cd6ef4", 00:08:52.336 "assigned_rate_limits": { 00:08:52.336 "rw_ios_per_sec": 0, 00:08:52.336 "rw_mbytes_per_sec": 0, 00:08:52.336 "r_mbytes_per_sec": 0, 00:08:52.336 "w_mbytes_per_sec": 0 00:08:52.336 }, 00:08:52.336 "claimed": true, 00:08:52.336 "claim_type": "exclusive_write", 00:08:52.336 "zoned": false, 00:08:52.336 "supported_io_types": { 00:08:52.336 "read": true, 00:08:52.336 "write": true, 00:08:52.336 "unmap": true, 00:08:52.336 "write_zeroes": true, 00:08:52.336 "flush": true, 00:08:52.336 "reset": true, 00:08:52.336 "compare": false, 00:08:52.336 "compare_and_write": false, 00:08:52.336 "abort": true, 00:08:52.336 "nvme_admin": false, 00:08:52.336 "nvme_io": false 00:08:52.336 }, 00:08:52.336 "memory_domains": [ 00:08:52.336 { 00:08:52.336 "dma_device_id": "system", 00:08:52.336 "dma_device_type": 1 00:08:52.336 }, 00:08:52.336 { 00:08:52.336 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:52.336 "dma_device_type": 2 00:08:52.336 } 00:08:52.336 ], 00:08:52.336 "driver_specific": {} 00:08:52.336 } 00:08:52.336 ] 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:52.336 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:08:52.595 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:52.595 "name": "Existed_Raid", 00:08:52.595 "uuid": "10ea50e6-7fe7-4b20-8f21-903aa43bcbbf", 00:08:52.595 "strip_size_kb": 64, 00:08:52.595 "state": "online", 00:08:52.595 "raid_level": "raid0", 00:08:52.595 "superblock": true, 00:08:52.595 "num_base_bdevs": 2, 00:08:52.595 "num_base_bdevs_discovered": 2, 
00:08:52.595 "num_base_bdevs_operational": 2, 00:08:52.595 "base_bdevs_list": [ 00:08:52.595 { 00:08:52.595 "name": "BaseBdev1", 00:08:52.595 "uuid": "1f4ab9c1-378d-45cb-89ee-8df12ad3d2d8", 00:08:52.595 "is_configured": true, 00:08:52.595 "data_offset": 2048, 00:08:52.595 "data_size": 63488 00:08:52.595 }, 00:08:52.595 { 00:08:52.595 "name": "BaseBdev2", 00:08:52.595 "uuid": "dbc6e2dd-108f-454f-8f35-41b056cd6ef4", 00:08:52.595 "is_configured": true, 00:08:52.595 "data_offset": 2048, 00:08:52.595 "data_size": 63488 00:08:52.595 } 00:08:52.595 ] 00:08:52.595 }' 00:08:52.595 04:09:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:08:52.595 04:09:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:08:53.161 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:08:53.161 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:08:53.161 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:08:53.161 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:08:53.161 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:08:53.161 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:08:53.161 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:08:53.161 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:08:53.419 [2024-05-15 04:09:41.296187] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:08:53.419 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:08:53.419 "name": "Existed_Raid", 00:08:53.419 "aliases": [ 00:08:53.419 "10ea50e6-7fe7-4b20-8f21-903aa43bcbbf" 00:08:53.419 ], 00:08:53.419 "product_name": "Raid Volume", 00:08:53.419 "block_size": 512, 00:08:53.419 "num_blocks": 126976, 00:08:53.419 "uuid": "10ea50e6-7fe7-4b20-8f21-903aa43bcbbf", 00:08:53.419 "assigned_rate_limits": { 00:08:53.419 "rw_ios_per_sec": 0, 00:08:53.419 "rw_mbytes_per_sec": 0, 00:08:53.419 "r_mbytes_per_sec": 0, 00:08:53.419 "w_mbytes_per_sec": 0 00:08:53.419 }, 00:08:53.419 "claimed": false, 00:08:53.419 "zoned": false, 00:08:53.419 "supported_io_types": { 00:08:53.419 "read": true, 00:08:53.419 "write": true, 00:08:53.419 "unmap": true, 00:08:53.419 "write_zeroes": true, 00:08:53.419 "flush": true, 00:08:53.419 "reset": true, 00:08:53.419 "compare": false, 00:08:53.419 "compare_and_write": false, 00:08:53.419 "abort": false, 00:08:53.419 "nvme_admin": false, 00:08:53.419 "nvme_io": false 00:08:53.419 }, 00:08:53.419 "memory_domains": [ 00:08:53.419 { 00:08:53.419 "dma_device_id": "system", 00:08:53.419 "dma_device_type": 1 00:08:53.419 }, 00:08:53.419 { 00:08:53.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:53.419 "dma_device_type": 2 00:08:53.419 }, 00:08:53.419 { 00:08:53.419 "dma_device_id": "system", 00:08:53.419 "dma_device_type": 1 00:08:53.419 }, 00:08:53.419 { 00:08:53.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:53.419 "dma_device_type": 2 00:08:53.419 } 00:08:53.419 ], 00:08:53.419 "driver_specific": { 00:08:53.419 "raid": { 00:08:53.419 "uuid": 
"10ea50e6-7fe7-4b20-8f21-903aa43bcbbf", 00:08:53.419 "strip_size_kb": 64, 00:08:53.419 "state": "online", 00:08:53.419 "raid_level": "raid0", 00:08:53.419 "superblock": true, 00:08:53.419 "num_base_bdevs": 2, 00:08:53.419 "num_base_bdevs_discovered": 2, 00:08:53.419 "num_base_bdevs_operational": 2, 00:08:53.419 "base_bdevs_list": [ 00:08:53.419 { 00:08:53.419 "name": "BaseBdev1", 00:08:53.419 "uuid": "1f4ab9c1-378d-45cb-89ee-8df12ad3d2d8", 00:08:53.419 "is_configured": true, 00:08:53.419 "data_offset": 2048, 00:08:53.419 "data_size": 63488 00:08:53.419 }, 00:08:53.419 { 00:08:53.419 "name": "BaseBdev2", 00:08:53.419 "uuid": "dbc6e2dd-108f-454f-8f35-41b056cd6ef4", 00:08:53.419 "is_configured": true, 00:08:53.419 "data_offset": 2048, 00:08:53.419 "data_size": 63488 00:08:53.419 } 00:08:53.420 ] 00:08:53.420 } 00:08:53.420 } 00:08:53.420 }' 00:08:53.420 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:08:53.420 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:08:53.420 BaseBdev2' 00:08:53.420 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:08:53.420 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:08:53.420 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:08:53.678 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:08:53.678 "name": "BaseBdev1", 00:08:53.678 "aliases": [ 00:08:53.678 "1f4ab9c1-378d-45cb-89ee-8df12ad3d2d8" 00:08:53.678 ], 00:08:53.678 "product_name": "Malloc disk", 00:08:53.678 "block_size": 512, 00:08:53.678 "num_blocks": 65536, 00:08:53.678 "uuid": "1f4ab9c1-378d-45cb-89ee-8df12ad3d2d8", 00:08:53.678 "assigned_rate_limits": { 00:08:53.678 "rw_ios_per_sec": 0, 00:08:53.678 "rw_mbytes_per_sec": 0, 00:08:53.678 "r_mbytes_per_sec": 0, 00:08:53.678 "w_mbytes_per_sec": 0 00:08:53.678 }, 00:08:53.678 "claimed": true, 00:08:53.678 "claim_type": "exclusive_write", 00:08:53.678 "zoned": false, 00:08:53.678 "supported_io_types": { 00:08:53.678 "read": true, 00:08:53.678 "write": true, 00:08:53.678 "unmap": true, 00:08:53.678 "write_zeroes": true, 00:08:53.678 "flush": true, 00:08:53.678 "reset": true, 00:08:53.678 "compare": false, 00:08:53.678 "compare_and_write": false, 00:08:53.678 "abort": true, 00:08:53.678 "nvme_admin": false, 00:08:53.678 "nvme_io": false 00:08:53.678 }, 00:08:53.678 "memory_domains": [ 00:08:53.678 { 00:08:53.678 "dma_device_id": "system", 00:08:53.678 "dma_device_type": 1 00:08:53.678 }, 00:08:53.678 { 00:08:53.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:53.678 "dma_device_type": 2 00:08:53.678 } 00:08:53.678 ], 00:08:53.678 "driver_specific": {} 00:08:53.678 }' 00:08:53.678 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:08:53.678 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:08:53.678 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:08:53.678 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:08:53.937 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:08:53.937 
04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:08:53.937 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:08:53.937 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:08:53.937 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:08:53.937 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:08:53.937 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:08:53.937 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:08:53.937 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:08:53.937 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:08:53.937 04:09:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:08:54.195 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:08:54.195 "name": "BaseBdev2", 00:08:54.195 "aliases": [ 00:08:54.195 "dbc6e2dd-108f-454f-8f35-41b056cd6ef4" 00:08:54.195 ], 00:08:54.195 "product_name": "Malloc disk", 00:08:54.195 "block_size": 512, 00:08:54.195 "num_blocks": 65536, 00:08:54.195 "uuid": "dbc6e2dd-108f-454f-8f35-41b056cd6ef4", 00:08:54.195 "assigned_rate_limits": { 00:08:54.195 "rw_ios_per_sec": 0, 00:08:54.195 "rw_mbytes_per_sec": 0, 00:08:54.195 "r_mbytes_per_sec": 0, 00:08:54.195 "w_mbytes_per_sec": 0 00:08:54.195 }, 00:08:54.195 "claimed": true, 00:08:54.195 "claim_type": "exclusive_write", 00:08:54.195 "zoned": false, 00:08:54.195 "supported_io_types": { 00:08:54.195 "read": true, 00:08:54.195 "write": true, 00:08:54.195 "unmap": true, 00:08:54.195 "write_zeroes": true, 00:08:54.195 "flush": true, 00:08:54.195 "reset": true, 00:08:54.195 "compare": false, 00:08:54.195 "compare_and_write": false, 00:08:54.195 "abort": true, 00:08:54.195 "nvme_admin": false, 00:08:54.195 "nvme_io": false 00:08:54.195 }, 00:08:54.195 "memory_domains": [ 00:08:54.195 { 00:08:54.195 "dma_device_id": "system", 00:08:54.195 "dma_device_type": 1 00:08:54.195 }, 00:08:54.195 { 00:08:54.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:54.195 "dma_device_type": 2 00:08:54.195 } 00:08:54.195 ], 00:08:54.195 "driver_specific": {} 00:08:54.195 }' 00:08:54.195 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:08:54.453 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:08:54.453 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:08:54.453 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:08:54.453 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:08:54.453 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:08:54.453 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:08:54.453 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:08:54.453 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:08:54.453 
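Each per-bdev check above runs bdev_get_bdevs for a single base bdev and compares individual jq fields (block_size, md_size, md_interleave, dif_type) against the expected values. A condensed sketch of the same checks, again with SPDK_DIR and the shell variable as illustrative placeholders:

    # Fetch one base bdev and assert the properties the test cares about.
    info=$("$SPDK_DIR"/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 | jq '.[]')
    [[ $(echo "$info" | jq .block_size) == 512 ]]        # data block size of the malloc base bdev
    [[ $(echo "$info" | jq .md_size) == null ]]          # no separate metadata region
    [[ $(echo "$info" | jq .md_interleave) == null ]]    # so no metadata interleaving either
    [[ $(echo "$info" | jq .dif_type) == null ]]         # and no DIF protection configured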
04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:08:54.453 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:08:54.712 [2024-05-15 04:09:42.687711] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:08:54.712 [2024-05-15 04:09:42.687739] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:08:54.712 [2024-05-15 04:09:42.687790] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:54.712 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:08:55.278 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:55.278 "name": "Existed_Raid", 00:08:55.278 "uuid": "10ea50e6-7fe7-4b20-8f21-903aa43bcbbf", 00:08:55.278 "strip_size_kb": 64, 00:08:55.278 "state": "offline", 00:08:55.278 "raid_level": "raid0", 00:08:55.278 "superblock": true, 00:08:55.278 "num_base_bdevs": 2, 00:08:55.278 "num_base_bdevs_discovered": 1, 00:08:55.278 "num_base_bdevs_operational": 1, 00:08:55.278 "base_bdevs_list": [ 00:08:55.278 { 00:08:55.278 "name": null, 00:08:55.278 "uuid": "00000000-0000-0000-0000-000000000000", 00:08:55.278 "is_configured": 
false, 00:08:55.278 "data_offset": 2048, 00:08:55.278 "data_size": 63488 00:08:55.278 }, 00:08:55.278 { 00:08:55.278 "name": "BaseBdev2", 00:08:55.278 "uuid": "dbc6e2dd-108f-454f-8f35-41b056cd6ef4", 00:08:55.278 "is_configured": true, 00:08:55.278 "data_offset": 2048, 00:08:55.278 "data_size": 63488 00:08:55.278 } 00:08:55.278 ] 00:08:55.278 }' 00:08:55.278 04:09:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:08:55.278 04:09:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:08:55.536 04:09:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:08:55.536 04:09:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:08:55.803 04:09:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:55.803 04:09:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:08:55.803 04:09:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:08:55.803 04:09:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:08:55.803 04:09:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:08:56.061 [2024-05-15 04:09:44.026409] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:08:56.061 [2024-05-15 04:09:44.026464] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14576e0 name Existed_Raid, state offline 00:08:56.061 04:09:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:08:56.061 04:09:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:08:56.061 04:09:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:56.061 04:09:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 3826134 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3826134 ']' 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 3826134 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3826134 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3826134' 00:08:56.319 killing process with pid 3826134 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 3826134 00:08:56.319 [2024-05-15 04:09:44.330546] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:08:56.319 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 3826134 00:08:56.319 [2024-05-15 04:09:44.331693] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:08:56.884 04:09:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:08:56.884 00:08:56.884 real 0m10.377s 00:08:56.884 user 0m18.767s 00:08:56.884 sys 0m1.459s 00:08:56.884 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:56.884 04:09:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:08:56.884 ************************************ 00:08:56.884 END TEST raid_state_function_test_sb 00:08:56.884 ************************************ 00:08:56.884 04:09:44 bdev_raid -- bdev/bdev_raid.sh@805 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:08:56.884 04:09:44 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:08:56.884 04:09:44 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:56.884 04:09:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:08:56.884 ************************************ 00:08:56.884 START TEST raid_superblock_test 00:08:56.884 ************************************ 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid0 2 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid0 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid0 '!=' raid1 ']' 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # 
strip_size_create_arg='-z 64' 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=3827568 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 3827568 /var/tmp/spdk-raid.sock 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 3827568 ']' 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:08:56.884 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:08:56.884 04:09:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:08:56.884 [2024-05-15 04:09:44.701997] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:08:56.884 [2024-05-15 04:09:44.702063] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3827568 ] 00:08:56.884 [2024-05-15 04:09:44.777287] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.884 [2024-05-15 04:09:44.886523] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:57.142 [2024-05-15 04:09:44.964145] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:57.142 [2024-05-15 04:09:44.964193] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:08:57.706 04:09:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:08:57.706 04:09:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:08:57.706 04:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:08:57.706 04:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:08:57.706 04:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:08:57.707 04:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:08:57.707 04:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:08:57.707 04:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:08:57.707 04:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:08:57.707 04:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:08:57.707 04:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:08:57.964 malloc1 00:08:57.964 04:09:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:08:58.221 [2024-05-15 04:09:46.171791] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:08:58.221 [2024-05-15 04:09:46.171874] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:58.221 [2024-05-15 04:09:46.171906] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb38c20 00:08:58.221 [2024-05-15 04:09:46.171920] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:58.221 [2024-05-15 04:09:46.173589] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:58.221 [2024-05-15 04:09:46.173614] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:08:58.221 pt1 00:08:58.221 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:08:58.221 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:08:58.221 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:08:58.221 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:08:58.221 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:08:58.221 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:08:58.221 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:08:58.221 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:08:58.221 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:08:58.479 malloc2 00:08:58.479 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:08:58.739 [2024-05-15 04:09:46.672627] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:08:58.739 [2024-05-15 04:09:46.672684] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:58.739 [2024-05-15 04:09:46.672709] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb30c00 00:08:58.739 [2024-05-15 04:09:46.672722] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:58.739 [2024-05-15 04:09:46.674527] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:58.739 [2024-05-15 04:09:46.674552] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:08:58.739 pt2 00:08:58.739 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:08:58.739 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:08:58.739 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:08:59.028 [2024-05-15 04:09:46.961428] bdev_raid.c:3138:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt1 is claimed 00:08:59.028 [2024-05-15 04:09:46.962798] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:08:59.028 [2024-05-15 04:09:46.963006] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xb31230 00:08:59.028 [2024-05-15 04:09:46.963022] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:08:59.028 [2024-05-15 04:09:46.963251] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb4fb10 00:08:59.028 [2024-05-15 04:09:46.963418] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb31230 00:08:59.028 [2024-05-15 04:09:46.963447] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb31230 00:08:59.028 [2024-05-15 04:09:46.963583] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:08:59.028 04:09:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:08:59.286 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:08:59.286 "name": "raid_bdev1", 00:08:59.286 "uuid": "e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2", 00:08:59.286 "strip_size_kb": 64, 00:08:59.286 "state": "online", 00:08:59.286 "raid_level": "raid0", 00:08:59.286 "superblock": true, 00:08:59.286 "num_base_bdevs": 2, 00:08:59.286 "num_base_bdevs_discovered": 2, 00:08:59.286 "num_base_bdevs_operational": 2, 00:08:59.286 "base_bdevs_list": [ 00:08:59.286 { 00:08:59.286 "name": "pt1", 00:08:59.286 "uuid": "bcc7477e-8b6b-5b17-9ac1-5592256000b5", 00:08:59.286 "is_configured": true, 00:08:59.286 "data_offset": 2048, 00:08:59.286 "data_size": 63488 00:08:59.286 }, 00:08:59.286 { 00:08:59.286 "name": "pt2", 00:08:59.286 "uuid": "e491dfb6-9d71-5e00-a377-63abeec84d1b", 00:08:59.286 "is_configured": true, 00:08:59.286 "data_offset": 2048, 00:08:59.286 "data_size": 63488 00:08:59.286 } 00:08:59.286 ] 00:08:59.286 }' 00:08:59.286 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:08:59.286 04:09:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:08:59.861 04:09:47 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:08:59.861 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:08:59.861 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:08:59.861 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:08:59.861 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:08:59.861 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:08:59.861 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:08:59.861 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:09:00.119 [2024-05-15 04:09:47.968257] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:00.119 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:09:00.119 "name": "raid_bdev1", 00:09:00.119 "aliases": [ 00:09:00.119 "e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2" 00:09:00.119 ], 00:09:00.119 "product_name": "Raid Volume", 00:09:00.119 "block_size": 512, 00:09:00.119 "num_blocks": 126976, 00:09:00.119 "uuid": "e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2", 00:09:00.119 "assigned_rate_limits": { 00:09:00.119 "rw_ios_per_sec": 0, 00:09:00.119 "rw_mbytes_per_sec": 0, 00:09:00.119 "r_mbytes_per_sec": 0, 00:09:00.119 "w_mbytes_per_sec": 0 00:09:00.119 }, 00:09:00.119 "claimed": false, 00:09:00.119 "zoned": false, 00:09:00.119 "supported_io_types": { 00:09:00.119 "read": true, 00:09:00.119 "write": true, 00:09:00.119 "unmap": true, 00:09:00.119 "write_zeroes": true, 00:09:00.119 "flush": true, 00:09:00.119 "reset": true, 00:09:00.119 "compare": false, 00:09:00.119 "compare_and_write": false, 00:09:00.119 "abort": false, 00:09:00.119 "nvme_admin": false, 00:09:00.119 "nvme_io": false 00:09:00.119 }, 00:09:00.119 "memory_domains": [ 00:09:00.119 { 00:09:00.119 "dma_device_id": "system", 00:09:00.119 "dma_device_type": 1 00:09:00.119 }, 00:09:00.119 { 00:09:00.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:00.119 "dma_device_type": 2 00:09:00.119 }, 00:09:00.119 { 00:09:00.119 "dma_device_id": "system", 00:09:00.119 "dma_device_type": 1 00:09:00.119 }, 00:09:00.119 { 00:09:00.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:00.119 "dma_device_type": 2 00:09:00.119 } 00:09:00.119 ], 00:09:00.119 "driver_specific": { 00:09:00.119 "raid": { 00:09:00.119 "uuid": "e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2", 00:09:00.119 "strip_size_kb": 64, 00:09:00.119 "state": "online", 00:09:00.119 "raid_level": "raid0", 00:09:00.119 "superblock": true, 00:09:00.119 "num_base_bdevs": 2, 00:09:00.119 "num_base_bdevs_discovered": 2, 00:09:00.119 "num_base_bdevs_operational": 2, 00:09:00.119 "base_bdevs_list": [ 00:09:00.119 { 00:09:00.119 "name": "pt1", 00:09:00.119 "uuid": "bcc7477e-8b6b-5b17-9ac1-5592256000b5", 00:09:00.119 "is_configured": true, 00:09:00.119 "data_offset": 2048, 00:09:00.119 "data_size": 63488 00:09:00.119 }, 00:09:00.119 { 00:09:00.119 "name": "pt2", 00:09:00.119 "uuid": "e491dfb6-9d71-5e00-a377-63abeec84d1b", 00:09:00.119 "is_configured": true, 00:09:00.119 "data_offset": 2048, 00:09:00.119 "data_size": 63488 00:09:00.119 } 00:09:00.119 ] 00:09:00.119 } 00:09:00.119 } 00:09:00.119 }' 00:09:00.119 04:09:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq 
-r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:00.119 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:09:00.119 pt2' 00:09:00.119 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:00.119 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:00.119 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:00.377 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:00.377 "name": "pt1", 00:09:00.377 "aliases": [ 00:09:00.377 "bcc7477e-8b6b-5b17-9ac1-5592256000b5" 00:09:00.377 ], 00:09:00.377 "product_name": "passthru", 00:09:00.377 "block_size": 512, 00:09:00.377 "num_blocks": 65536, 00:09:00.377 "uuid": "bcc7477e-8b6b-5b17-9ac1-5592256000b5", 00:09:00.377 "assigned_rate_limits": { 00:09:00.377 "rw_ios_per_sec": 0, 00:09:00.377 "rw_mbytes_per_sec": 0, 00:09:00.377 "r_mbytes_per_sec": 0, 00:09:00.377 "w_mbytes_per_sec": 0 00:09:00.377 }, 00:09:00.377 "claimed": true, 00:09:00.377 "claim_type": "exclusive_write", 00:09:00.377 "zoned": false, 00:09:00.377 "supported_io_types": { 00:09:00.377 "read": true, 00:09:00.377 "write": true, 00:09:00.377 "unmap": true, 00:09:00.377 "write_zeroes": true, 00:09:00.377 "flush": true, 00:09:00.377 "reset": true, 00:09:00.377 "compare": false, 00:09:00.377 "compare_and_write": false, 00:09:00.377 "abort": true, 00:09:00.377 "nvme_admin": false, 00:09:00.377 "nvme_io": false 00:09:00.377 }, 00:09:00.377 "memory_domains": [ 00:09:00.377 { 00:09:00.377 "dma_device_id": "system", 00:09:00.377 "dma_device_type": 1 00:09:00.377 }, 00:09:00.377 { 00:09:00.378 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:00.378 "dma_device_type": 2 00:09:00.378 } 00:09:00.378 ], 00:09:00.378 "driver_specific": { 00:09:00.378 "passthru": { 00:09:00.378 "name": "pt1", 00:09:00.378 "base_bdev_name": "malloc1" 00:09:00.378 } 00:09:00.378 } 00:09:00.378 }' 00:09:00.378 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:00.378 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:00.378 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:00.378 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:00.378 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:00.635 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:00.635 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:00.635 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:00.635 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:00.635 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:00.635 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:00.635 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:00.635 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:00.635 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:00.635 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:00.893 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:00.893 "name": "pt2", 00:09:00.893 "aliases": [ 00:09:00.893 "e491dfb6-9d71-5e00-a377-63abeec84d1b" 00:09:00.893 ], 00:09:00.893 "product_name": "passthru", 00:09:00.893 "block_size": 512, 00:09:00.893 "num_blocks": 65536, 00:09:00.893 "uuid": "e491dfb6-9d71-5e00-a377-63abeec84d1b", 00:09:00.893 "assigned_rate_limits": { 00:09:00.893 "rw_ios_per_sec": 0, 00:09:00.893 "rw_mbytes_per_sec": 0, 00:09:00.893 "r_mbytes_per_sec": 0, 00:09:00.893 "w_mbytes_per_sec": 0 00:09:00.893 }, 00:09:00.893 "claimed": true, 00:09:00.893 "claim_type": "exclusive_write", 00:09:00.893 "zoned": false, 00:09:00.893 "supported_io_types": { 00:09:00.893 "read": true, 00:09:00.893 "write": true, 00:09:00.893 "unmap": true, 00:09:00.893 "write_zeroes": true, 00:09:00.893 "flush": true, 00:09:00.893 "reset": true, 00:09:00.893 "compare": false, 00:09:00.893 "compare_and_write": false, 00:09:00.893 "abort": true, 00:09:00.893 "nvme_admin": false, 00:09:00.893 "nvme_io": false 00:09:00.893 }, 00:09:00.893 "memory_domains": [ 00:09:00.893 { 00:09:00.893 "dma_device_id": "system", 00:09:00.893 "dma_device_type": 1 00:09:00.893 }, 00:09:00.893 { 00:09:00.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:00.893 "dma_device_type": 2 00:09:00.893 } 00:09:00.893 ], 00:09:00.893 "driver_specific": { 00:09:00.893 "passthru": { 00:09:00.893 "name": "pt2", 00:09:00.893 "base_bdev_name": "malloc2" 00:09:00.893 } 00:09:00.893 } 00:09:00.893 }' 00:09:00.893 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:00.893 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:00.893 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:00.893 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:01.151 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:01.151 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:01.151 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:01.151 04:09:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:01.151 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:01.151 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:01.151 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:01.151 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:01.151 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:01.151 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:09:01.410 [2024-05-15 04:09:49.336025] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:01.410 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2 00:09:01.410 04:09:49 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@436 -- # '[' -z e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2 ']' 00:09:01.410 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:01.668 [2024-05-15 04:09:49.616581] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:01.668 [2024-05-15 04:09:49.616611] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:01.668 [2024-05-15 04:09:49.616701] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:01.668 [2024-05-15 04:09:49.616759] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:01.668 [2024-05-15 04:09:49.616774] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb31230 name raid_bdev1, state offline 00:09:01.668 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:01.668 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:09:01.926 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:09:01.926 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:09:01.926 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:09:01.926 04:09:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:09:02.184 04:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:09:02.184 04:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:09:02.441 04:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:09:02.441 04:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:02.699 
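The teardown above deletes raid_bdev1 and both passthru bdevs, then confirms with a jq predicate over bdev_get_bdevs that no passthru bdev is left before the negative create test runs. A small sketch of that guard, with SPDK_DIR and the variable name as placeholders:

    # "any" is true only if at least one bdev with product_name "passthru" still exists.
    leftover=$("$SPDK_DIR"/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs \
        | jq -r '[.[] | select(.product_name == "passthru")] | any')
    [[ $leftover == false ]]   # the test expects a clean slate at this point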
04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:02.699 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:09:02.957 [2024-05-15 04:09:50.887974] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:09:02.957 [2024-05-15 04:09:50.889415] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:09:02.957 [2024-05-15 04:09:50.889484] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:09:02.957 [2024-05-15 04:09:50.889549] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:09:02.957 [2024-05-15 04:09:50.889584] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:02.957 [2024-05-15 04:09:50.889598] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb39800 name raid_bdev1, state configuring 00:09:02.957 request: 00:09:02.957 { 00:09:02.957 "name": "raid_bdev1", 00:09:02.957 "raid_level": "raid0", 00:09:02.957 "base_bdevs": [ 00:09:02.957 "malloc1", 00:09:02.957 "malloc2" 00:09:02.957 ], 00:09:02.957 "superblock": false, 00:09:02.957 "strip_size_kb": 64, 00:09:02.957 "method": "bdev_raid_create", 00:09:02.957 "req_id": 1 00:09:02.957 } 00:09:02.957 Got JSON-RPC error response 00:09:02.957 response: 00:09:02.957 { 00:09:02.957 "code": -17, 00:09:02.957 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:09:02.957 } 00:09:02.957 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:09:02.957 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:02.957 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:02.957 04:09:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:02.957 04:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:02.957 04:09:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:09:03.215 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:09:03.215 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:09:03.215 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:03.473 [2024-05-15 04:09:51.373207] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:03.473 [2024-05-15 04:09:51.373262] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:03.473 [2024-05-15 04:09:51.373288] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb390e0 00:09:03.473 [2024-05-15 04:09:51.373300] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:03.473 [2024-05-15 04:09:51.374891] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:03.473 [2024-05-15 04:09:51.374916] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:03.473 [2024-05-15 04:09:51.375001] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:09:03.473 [2024-05-15 04:09:51.375039] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:03.473 pt1 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:03.473 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:03.731 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:03.731 "name": "raid_bdev1", 00:09:03.731 "uuid": "e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2", 00:09:03.731 "strip_size_kb": 64, 00:09:03.731 "state": "configuring", 00:09:03.731 "raid_level": "raid0", 00:09:03.731 "superblock": true, 00:09:03.731 "num_base_bdevs": 2, 00:09:03.731 "num_base_bdevs_discovered": 1, 00:09:03.731 "num_base_bdevs_operational": 2, 00:09:03.731 "base_bdevs_list": [ 00:09:03.731 { 00:09:03.731 "name": "pt1", 00:09:03.731 "uuid": "bcc7477e-8b6b-5b17-9ac1-5592256000b5", 00:09:03.731 "is_configured": true, 00:09:03.731 "data_offset": 2048, 00:09:03.731 "data_size": 63488 00:09:03.731 }, 00:09:03.731 { 00:09:03.731 "name": null, 00:09:03.731 "uuid": "e491dfb6-9d71-5e00-a377-63abeec84d1b", 00:09:03.731 "is_configured": false, 00:09:03.731 "data_offset": 2048, 00:09:03.731 "data_size": 63488 00:09:03.731 } 00:09:03.731 ] 00:09:03.731 }' 00:09:03.731 04:09:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:03.731 04:09:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:04.296 
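At this point the superblock found on malloc1 has brought raid_bdev1 back in "configuring" state with only pt1 discovered; once pt2 is registered below, the raid assembles and goes online. The state check the test keeps repeating can be sketched as follows (SPDK_DIR and tmp are placeholders as before):

    # Pull the raid bdev record and the two fields the state check compares.
    tmp=$("$SPDK_DIR"/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1")')
    echo "$tmp" | jq -r .state                       # "configuring" with one base bdev, "online" with both
    echo "$tmp" | jq -r .num_base_bdevs_discovered   # 1 now, 2 once pt2 is created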
04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:09:04.296 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:09:04.296 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:09:04.296 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:04.552 [2024-05-15 04:09:52.440078] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:04.552 [2024-05-15 04:09:52.440150] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:04.553 [2024-05-15 04:09:52.440177] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb32650 00:09:04.553 [2024-05-15 04:09:52.440193] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:04.553 [2024-05-15 04:09:52.440593] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:04.553 [2024-05-15 04:09:52.440621] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:04.553 [2024-05-15 04:09:52.440707] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:09:04.553 [2024-05-15 04:09:52.440736] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:04.553 [2024-05-15 04:09:52.440871] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xb36840 00:09:04.553 [2024-05-15 04:09:52.440889] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:04.553 [2024-05-15 04:09:52.441060] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb374c0 00:09:04.553 [2024-05-15 04:09:52.441219] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb36840 00:09:04.553 [2024-05-15 04:09:52.441234] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb36840 00:09:04.553 [2024-05-15 04:09:52.441345] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:04.553 pt2 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@125 -- # local tmp 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:04.553 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:04.809 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:04.809 "name": "raid_bdev1", 00:09:04.809 "uuid": "e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2", 00:09:04.809 "strip_size_kb": 64, 00:09:04.809 "state": "online", 00:09:04.809 "raid_level": "raid0", 00:09:04.809 "superblock": true, 00:09:04.809 "num_base_bdevs": 2, 00:09:04.809 "num_base_bdevs_discovered": 2, 00:09:04.809 "num_base_bdevs_operational": 2, 00:09:04.809 "base_bdevs_list": [ 00:09:04.809 { 00:09:04.809 "name": "pt1", 00:09:04.809 "uuid": "bcc7477e-8b6b-5b17-9ac1-5592256000b5", 00:09:04.809 "is_configured": true, 00:09:04.809 "data_offset": 2048, 00:09:04.809 "data_size": 63488 00:09:04.809 }, 00:09:04.809 { 00:09:04.809 "name": "pt2", 00:09:04.809 "uuid": "e491dfb6-9d71-5e00-a377-63abeec84d1b", 00:09:04.809 "is_configured": true, 00:09:04.809 "data_offset": 2048, 00:09:04.810 "data_size": 63488 00:09:04.810 } 00:09:04.810 ] 00:09:04.810 }' 00:09:04.810 04:09:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:04.810 04:09:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:05.373 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:09:05.373 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:09:05.373 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:09:05.373 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:09:05.373 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:09:05.374 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:09:05.374 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:05.374 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:09:05.630 [2024-05-15 04:09:53.507110] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:05.630 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:09:05.630 "name": "raid_bdev1", 00:09:05.630 "aliases": [ 00:09:05.630 "e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2" 00:09:05.630 ], 00:09:05.630 "product_name": "Raid Volume", 00:09:05.630 "block_size": 512, 00:09:05.630 "num_blocks": 126976, 00:09:05.630 "uuid": "e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2", 00:09:05.630 "assigned_rate_limits": { 00:09:05.630 "rw_ios_per_sec": 0, 00:09:05.630 "rw_mbytes_per_sec": 0, 00:09:05.630 "r_mbytes_per_sec": 0, 00:09:05.630 "w_mbytes_per_sec": 0 00:09:05.630 }, 00:09:05.630 "claimed": false, 00:09:05.630 "zoned": false, 00:09:05.630 "supported_io_types": { 00:09:05.630 "read": true, 00:09:05.630 "write": true, 00:09:05.630 "unmap": true, 00:09:05.630 "write_zeroes": true, 00:09:05.630 "flush": true, 00:09:05.630 "reset": true, 00:09:05.630 "compare": false, 00:09:05.630 "compare_and_write": false, 00:09:05.630 "abort": false, 
00:09:05.630 "nvme_admin": false, 00:09:05.630 "nvme_io": false 00:09:05.630 }, 00:09:05.630 "memory_domains": [ 00:09:05.630 { 00:09:05.630 "dma_device_id": "system", 00:09:05.630 "dma_device_type": 1 00:09:05.630 }, 00:09:05.630 { 00:09:05.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:05.630 "dma_device_type": 2 00:09:05.630 }, 00:09:05.630 { 00:09:05.630 "dma_device_id": "system", 00:09:05.630 "dma_device_type": 1 00:09:05.630 }, 00:09:05.630 { 00:09:05.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:05.630 "dma_device_type": 2 00:09:05.630 } 00:09:05.630 ], 00:09:05.630 "driver_specific": { 00:09:05.630 "raid": { 00:09:05.630 "uuid": "e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2", 00:09:05.630 "strip_size_kb": 64, 00:09:05.630 "state": "online", 00:09:05.630 "raid_level": "raid0", 00:09:05.630 "superblock": true, 00:09:05.630 "num_base_bdevs": 2, 00:09:05.630 "num_base_bdevs_discovered": 2, 00:09:05.630 "num_base_bdevs_operational": 2, 00:09:05.630 "base_bdevs_list": [ 00:09:05.630 { 00:09:05.630 "name": "pt1", 00:09:05.630 "uuid": "bcc7477e-8b6b-5b17-9ac1-5592256000b5", 00:09:05.630 "is_configured": true, 00:09:05.630 "data_offset": 2048, 00:09:05.630 "data_size": 63488 00:09:05.630 }, 00:09:05.630 { 00:09:05.630 "name": "pt2", 00:09:05.630 "uuid": "e491dfb6-9d71-5e00-a377-63abeec84d1b", 00:09:05.630 "is_configured": true, 00:09:05.630 "data_offset": 2048, 00:09:05.630 "data_size": 63488 00:09:05.630 } 00:09:05.630 ] 00:09:05.630 } 00:09:05.630 } 00:09:05.630 }' 00:09:05.630 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:05.630 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:09:05.630 pt2' 00:09:05.630 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:05.630 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:05.630 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:05.887 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:05.887 "name": "pt1", 00:09:05.887 "aliases": [ 00:09:05.887 "bcc7477e-8b6b-5b17-9ac1-5592256000b5" 00:09:05.887 ], 00:09:05.887 "product_name": "passthru", 00:09:05.887 "block_size": 512, 00:09:05.887 "num_blocks": 65536, 00:09:05.887 "uuid": "bcc7477e-8b6b-5b17-9ac1-5592256000b5", 00:09:05.887 "assigned_rate_limits": { 00:09:05.887 "rw_ios_per_sec": 0, 00:09:05.887 "rw_mbytes_per_sec": 0, 00:09:05.887 "r_mbytes_per_sec": 0, 00:09:05.887 "w_mbytes_per_sec": 0 00:09:05.887 }, 00:09:05.887 "claimed": true, 00:09:05.887 "claim_type": "exclusive_write", 00:09:05.887 "zoned": false, 00:09:05.887 "supported_io_types": { 00:09:05.887 "read": true, 00:09:05.887 "write": true, 00:09:05.887 "unmap": true, 00:09:05.887 "write_zeroes": true, 00:09:05.887 "flush": true, 00:09:05.887 "reset": true, 00:09:05.887 "compare": false, 00:09:05.887 "compare_and_write": false, 00:09:05.887 "abort": true, 00:09:05.887 "nvme_admin": false, 00:09:05.887 "nvme_io": false 00:09:05.887 }, 00:09:05.887 "memory_domains": [ 00:09:05.887 { 00:09:05.887 "dma_device_id": "system", 00:09:05.887 "dma_device_type": 1 00:09:05.887 }, 00:09:05.887 { 00:09:05.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:05.887 "dma_device_type": 2 00:09:05.887 } 00:09:05.887 ], 00:09:05.887 
"driver_specific": { 00:09:05.887 "passthru": { 00:09:05.887 "name": "pt1", 00:09:05.887 "base_bdev_name": "malloc1" 00:09:05.887 } 00:09:05.887 } 00:09:05.887 }' 00:09:05.887 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:05.887 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:05.887 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:05.887 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:06.144 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:06.144 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:06.145 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:06.145 04:09:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:06.145 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:06.145 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:06.145 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:06.145 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:06.145 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:06.145 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:06.145 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:06.402 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:06.402 "name": "pt2", 00:09:06.402 "aliases": [ 00:09:06.402 "e491dfb6-9d71-5e00-a377-63abeec84d1b" 00:09:06.402 ], 00:09:06.402 "product_name": "passthru", 00:09:06.402 "block_size": 512, 00:09:06.402 "num_blocks": 65536, 00:09:06.402 "uuid": "e491dfb6-9d71-5e00-a377-63abeec84d1b", 00:09:06.402 "assigned_rate_limits": { 00:09:06.402 "rw_ios_per_sec": 0, 00:09:06.402 "rw_mbytes_per_sec": 0, 00:09:06.402 "r_mbytes_per_sec": 0, 00:09:06.402 "w_mbytes_per_sec": 0 00:09:06.402 }, 00:09:06.402 "claimed": true, 00:09:06.402 "claim_type": "exclusive_write", 00:09:06.402 "zoned": false, 00:09:06.402 "supported_io_types": { 00:09:06.402 "read": true, 00:09:06.402 "write": true, 00:09:06.402 "unmap": true, 00:09:06.402 "write_zeroes": true, 00:09:06.402 "flush": true, 00:09:06.402 "reset": true, 00:09:06.402 "compare": false, 00:09:06.402 "compare_and_write": false, 00:09:06.402 "abort": true, 00:09:06.402 "nvme_admin": false, 00:09:06.402 "nvme_io": false 00:09:06.402 }, 00:09:06.402 "memory_domains": [ 00:09:06.402 { 00:09:06.402 "dma_device_id": "system", 00:09:06.402 "dma_device_type": 1 00:09:06.402 }, 00:09:06.402 { 00:09:06.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:06.402 "dma_device_type": 2 00:09:06.402 } 00:09:06.402 ], 00:09:06.402 "driver_specific": { 00:09:06.402 "passthru": { 00:09:06.402 "name": "pt2", 00:09:06.402 "base_bdev_name": "malloc2" 00:09:06.402 } 00:09:06.402 } 00:09:06.402 }' 00:09:06.402 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:06.402 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 
-- # [[ 512 == 512 ]] 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:06.660 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:09:06.918 [2024-05-15 04:09:54.858711] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2 '!=' e4ed57cb-2c1d-4d50-9c66-7fc016e4d0c2 ']' 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid0 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # killprocess 3827568 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 3827568 ']' 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 3827568 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3827568 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3827568' 00:09:06.918 killing process with pid 3827568 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 3827568 00:09:06.918 [2024-05-15 04:09:54.904035] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:06.918 04:09:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 3827568 00:09:06.918 [2024-05-15 04:09:54.904126] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:06.918 [2024-05-15 04:09:54.904196] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:06.918 [2024-05-15 04:09:54.904212] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0xb36840 name raid_bdev1, state offline 00:09:06.918 [2024-05-15 04:09:54.925009] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:07.175 04:09:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@565 -- # return 0 00:09:07.175 00:09:07.175 real 0m10.537s 00:09:07.175 user 0m18.980s 00:09:07.175 sys 0m1.620s 00:09:07.175 04:09:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:07.175 04:09:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:07.175 ************************************ 00:09:07.175 END TEST raid_superblock_test 00:09:07.175 ************************************ 00:09:07.433 04:09:55 bdev_raid -- bdev/bdev_raid.sh@802 -- # for level in raid0 concat raid1 00:09:07.433 04:09:55 bdev_raid -- bdev/bdev_raid.sh@803 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:09:07.433 04:09:55 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:09:07.433 04:09:55 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:07.433 04:09:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:07.433 ************************************ 00:09:07.433 START TEST raid_state_function_test 00:09:07.433 ************************************ 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 2 false 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:09:07.433 04:09:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=3828994 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3828994' 00:09:07.433 Process raid pid: 3828994 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 3828994 /var/tmp/spdk-raid.sock 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 3828994 ']' 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:07.433 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:07.433 04:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:07.433 [2024-05-15 04:09:55.293261] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
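The raid_state_function_test run that begins here drives the raid module entirely over the /var/tmp/spdk-raid.sock RPC socket served by the bdev_svc app it just launched. As a condensed sketch of the flow visible in this trace (same sizes, names and flags as in the log, but not an exact replay of bdev_raid.sh; RPC is shorthand for the rpc.py invocation used throughout this run):

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

  # two 32 MB malloc bdevs with 512 B blocks act as the base bdevs (65536 blocks each, as in the dumps above)
  $RPC bdev_malloc_create 32 512 -b BaseBdev1
  $RPC bdev_malloc_create 32 512 -b BaseBdev2

  # assemble a concat raid with a 64 KB strip size and no superblock
  $RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

  # inspect state the same way the test does: dump everything and filter with jq
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

  # tear down: raid first, then the base bdevs
  $RPC bdev_raid_delete Existed_Raid
  $RPC bdev_malloc_delete BaseBdev1
  $RPC bdev_malloc_delete BaseBdev2

The test itself also exercises the opposite ordering, creating Existed_Raid before its base bdevs exist and checking that the raid stays in the "configuring" state with num_base_bdevs_discovered 0 until both bases are created and claimed.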
00:09:07.433 [2024-05-15 04:09:55.293329] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:07.433 [2024-05-15 04:09:55.372263] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.691 [2024-05-15 04:09:55.481769] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.691 [2024-05-15 04:09:55.551480] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:07.691 [2024-05-15 04:09:55.551525] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:07.691 04:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:07.691 04:09:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:09:07.691 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:07.949 [2024-05-15 04:09:55.841349] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:07.949 [2024-05-15 04:09:55.841389] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:07.949 [2024-05-15 04:09:55.841400] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:07.949 [2024-05-15 04:09:55.841411] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:07.949 04:09:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:08.206 04:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:08.206 "name": "Existed_Raid", 00:09:08.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:08.206 "strip_size_kb": 64, 00:09:08.206 "state": "configuring", 00:09:08.206 "raid_level": "concat", 00:09:08.206 "superblock": false, 00:09:08.206 "num_base_bdevs": 
2, 00:09:08.206 "num_base_bdevs_discovered": 0, 00:09:08.206 "num_base_bdevs_operational": 2, 00:09:08.206 "base_bdevs_list": [ 00:09:08.206 { 00:09:08.206 "name": "BaseBdev1", 00:09:08.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:08.206 "is_configured": false, 00:09:08.206 "data_offset": 0, 00:09:08.206 "data_size": 0 00:09:08.206 }, 00:09:08.206 { 00:09:08.206 "name": "BaseBdev2", 00:09:08.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:08.206 "is_configured": false, 00:09:08.206 "data_offset": 0, 00:09:08.206 "data_size": 0 00:09:08.206 } 00:09:08.206 ] 00:09:08.206 }' 00:09:08.206 04:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:08.206 04:09:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:08.771 04:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:09.028 [2024-05-15 04:09:56.851934] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:09.028 [2024-05-15 04:09:56.851968] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ce000 name Existed_Raid, state configuring 00:09:09.028 04:09:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:09.286 [2024-05-15 04:09:57.092583] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:09.286 [2024-05-15 04:09:57.092617] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:09.286 [2024-05-15 04:09:57.092629] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:09.286 [2024-05-15 04:09:57.092642] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:09.286 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:09.544 [2024-05-15 04:09:57.348900] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:09.544 BaseBdev1 00:09:09.544 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:09:09.544 04:09:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:09:09.544 04:09:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:09.544 04:09:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:09:09.544 04:09:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:09.544 04:09:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:09.544 04:09:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:09.801 04:09:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:10.060 [ 00:09:10.060 { 
00:09:10.060 "name": "BaseBdev1", 00:09:10.060 "aliases": [ 00:09:10.060 "dd23b191-ef94-4cf3-b575-44c07552e59e" 00:09:10.060 ], 00:09:10.060 "product_name": "Malloc disk", 00:09:10.060 "block_size": 512, 00:09:10.060 "num_blocks": 65536, 00:09:10.060 "uuid": "dd23b191-ef94-4cf3-b575-44c07552e59e", 00:09:10.060 "assigned_rate_limits": { 00:09:10.060 "rw_ios_per_sec": 0, 00:09:10.060 "rw_mbytes_per_sec": 0, 00:09:10.060 "r_mbytes_per_sec": 0, 00:09:10.060 "w_mbytes_per_sec": 0 00:09:10.060 }, 00:09:10.060 "claimed": true, 00:09:10.060 "claim_type": "exclusive_write", 00:09:10.060 "zoned": false, 00:09:10.060 "supported_io_types": { 00:09:10.060 "read": true, 00:09:10.060 "write": true, 00:09:10.060 "unmap": true, 00:09:10.060 "write_zeroes": true, 00:09:10.060 "flush": true, 00:09:10.060 "reset": true, 00:09:10.060 "compare": false, 00:09:10.060 "compare_and_write": false, 00:09:10.060 "abort": true, 00:09:10.060 "nvme_admin": false, 00:09:10.060 "nvme_io": false 00:09:10.060 }, 00:09:10.060 "memory_domains": [ 00:09:10.060 { 00:09:10.060 "dma_device_id": "system", 00:09:10.060 "dma_device_type": 1 00:09:10.060 }, 00:09:10.060 { 00:09:10.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:10.060 "dma_device_type": 2 00:09:10.060 } 00:09:10.060 ], 00:09:10.060 "driver_specific": {} 00:09:10.060 } 00:09:10.060 ] 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:10.060 04:09:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:10.318 04:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:10.318 "name": "Existed_Raid", 00:09:10.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:10.318 "strip_size_kb": 64, 00:09:10.318 "state": "configuring", 00:09:10.318 "raid_level": "concat", 00:09:10.318 "superblock": false, 00:09:10.318 "num_base_bdevs": 2, 00:09:10.318 "num_base_bdevs_discovered": 1, 00:09:10.318 "num_base_bdevs_operational": 2, 00:09:10.318 "base_bdevs_list": [ 00:09:10.318 { 00:09:10.318 "name": "BaseBdev1", 00:09:10.319 "uuid": "dd23b191-ef94-4cf3-b575-44c07552e59e", 00:09:10.319 
"is_configured": true, 00:09:10.319 "data_offset": 0, 00:09:10.319 "data_size": 65536 00:09:10.319 }, 00:09:10.319 { 00:09:10.319 "name": "BaseBdev2", 00:09:10.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:10.319 "is_configured": false, 00:09:10.319 "data_offset": 0, 00:09:10.319 "data_size": 0 00:09:10.319 } 00:09:10.319 ] 00:09:10.319 }' 00:09:10.319 04:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:10.319 04:09:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:10.884 04:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:10.884 [2024-05-15 04:09:58.828851] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:10.884 [2024-05-15 04:09:58.828905] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20cd8f0 name Existed_Raid, state configuring 00:09:10.884 04:09:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:11.142 [2024-05-15 04:09:59.069502] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:11.142 [2024-05-15 04:09:59.071073] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:11.142 [2024-05-15 04:09:59.071109] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:11.142 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:11.401 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:11.401 "name": "Existed_Raid", 00:09:11.402 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:09:11.402 "strip_size_kb": 64, 00:09:11.402 "state": "configuring", 00:09:11.402 "raid_level": "concat", 00:09:11.402 "superblock": false, 00:09:11.402 "num_base_bdevs": 2, 00:09:11.402 "num_base_bdevs_discovered": 1, 00:09:11.402 "num_base_bdevs_operational": 2, 00:09:11.402 "base_bdevs_list": [ 00:09:11.402 { 00:09:11.402 "name": "BaseBdev1", 00:09:11.402 "uuid": "dd23b191-ef94-4cf3-b575-44c07552e59e", 00:09:11.402 "is_configured": true, 00:09:11.402 "data_offset": 0, 00:09:11.402 "data_size": 65536 00:09:11.402 }, 00:09:11.402 { 00:09:11.402 "name": "BaseBdev2", 00:09:11.402 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:11.402 "is_configured": false, 00:09:11.402 "data_offset": 0, 00:09:11.402 "data_size": 0 00:09:11.402 } 00:09:11.402 ] 00:09:11.402 }' 00:09:11.402 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:11.402 04:09:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:11.968 04:09:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:12.226 [2024-05-15 04:10:00.106163] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:12.226 [2024-05-15 04:10:00.106220] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x20ce6e0 00:09:12.226 [2024-05-15 04:10:00.106231] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:12.226 [2024-05-15 04:10:00.106440] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20c4e60 00:09:12.226 [2024-05-15 04:10:00.106602] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20ce6e0 00:09:12.226 [2024-05-15 04:10:00.106619] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20ce6e0 00:09:12.226 [2024-05-15 04:10:00.106861] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:12.226 BaseBdev2 00:09:12.226 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:09:12.226 04:10:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:09:12.226 04:10:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:12.226 04:10:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:09:12.226 04:10:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:12.226 04:10:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:12.226 04:10:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:12.484 04:10:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:12.742 [ 00:09:12.742 { 00:09:12.742 "name": "BaseBdev2", 00:09:12.742 "aliases": [ 00:09:12.742 "bb572962-c061-48d3-ba31-1d1ac2489045" 00:09:12.742 ], 00:09:12.742 "product_name": "Malloc disk", 00:09:12.742 "block_size": 512, 00:09:12.742 "num_blocks": 65536, 00:09:12.742 "uuid": 
"bb572962-c061-48d3-ba31-1d1ac2489045", 00:09:12.742 "assigned_rate_limits": { 00:09:12.742 "rw_ios_per_sec": 0, 00:09:12.742 "rw_mbytes_per_sec": 0, 00:09:12.742 "r_mbytes_per_sec": 0, 00:09:12.742 "w_mbytes_per_sec": 0 00:09:12.742 }, 00:09:12.742 "claimed": true, 00:09:12.742 "claim_type": "exclusive_write", 00:09:12.742 "zoned": false, 00:09:12.742 "supported_io_types": { 00:09:12.742 "read": true, 00:09:12.742 "write": true, 00:09:12.742 "unmap": true, 00:09:12.742 "write_zeroes": true, 00:09:12.742 "flush": true, 00:09:12.742 "reset": true, 00:09:12.742 "compare": false, 00:09:12.742 "compare_and_write": false, 00:09:12.742 "abort": true, 00:09:12.742 "nvme_admin": false, 00:09:12.742 "nvme_io": false 00:09:12.742 }, 00:09:12.742 "memory_domains": [ 00:09:12.742 { 00:09:12.742 "dma_device_id": "system", 00:09:12.742 "dma_device_type": 1 00:09:12.742 }, 00:09:12.742 { 00:09:12.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:12.742 "dma_device_type": 2 00:09:12.742 } 00:09:12.742 ], 00:09:12.742 "driver_specific": {} 00:09:12.742 } 00:09:12.742 ] 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:12.742 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:13.000 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:13.000 "name": "Existed_Raid", 00:09:13.000 "uuid": "697d0950-ad97-4089-b5ef-f07e85c9ed0c", 00:09:13.000 "strip_size_kb": 64, 00:09:13.000 "state": "online", 00:09:13.000 "raid_level": "concat", 00:09:13.000 "superblock": false, 00:09:13.000 "num_base_bdevs": 2, 00:09:13.000 "num_base_bdevs_discovered": 2, 00:09:13.000 "num_base_bdevs_operational": 2, 00:09:13.000 "base_bdevs_list": [ 00:09:13.000 { 00:09:13.000 "name": "BaseBdev1", 00:09:13.000 "uuid": "dd23b191-ef94-4cf3-b575-44c07552e59e", 00:09:13.000 "is_configured": true, 00:09:13.000 "data_offset": 0, 
00:09:13.000 "data_size": 65536 00:09:13.000 }, 00:09:13.000 { 00:09:13.000 "name": "BaseBdev2", 00:09:13.000 "uuid": "bb572962-c061-48d3-ba31-1d1ac2489045", 00:09:13.000 "is_configured": true, 00:09:13.000 "data_offset": 0, 00:09:13.000 "data_size": 65536 00:09:13.000 } 00:09:13.000 ] 00:09:13.000 }' 00:09:13.000 04:10:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:13.000 04:10:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:13.576 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:09:13.576 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:09:13.576 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:09:13.576 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:09:13.576 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:09:13.576 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:09:13.576 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:13.576 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:09:13.836 [2024-05-15 04:10:01.630494] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:13.836 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:09:13.836 "name": "Existed_Raid", 00:09:13.836 "aliases": [ 00:09:13.836 "697d0950-ad97-4089-b5ef-f07e85c9ed0c" 00:09:13.836 ], 00:09:13.836 "product_name": "Raid Volume", 00:09:13.836 "block_size": 512, 00:09:13.836 "num_blocks": 131072, 00:09:13.836 "uuid": "697d0950-ad97-4089-b5ef-f07e85c9ed0c", 00:09:13.836 "assigned_rate_limits": { 00:09:13.836 "rw_ios_per_sec": 0, 00:09:13.836 "rw_mbytes_per_sec": 0, 00:09:13.836 "r_mbytes_per_sec": 0, 00:09:13.836 "w_mbytes_per_sec": 0 00:09:13.836 }, 00:09:13.836 "claimed": false, 00:09:13.836 "zoned": false, 00:09:13.836 "supported_io_types": { 00:09:13.836 "read": true, 00:09:13.836 "write": true, 00:09:13.836 "unmap": true, 00:09:13.836 "write_zeroes": true, 00:09:13.836 "flush": true, 00:09:13.836 "reset": true, 00:09:13.836 "compare": false, 00:09:13.836 "compare_and_write": false, 00:09:13.836 "abort": false, 00:09:13.836 "nvme_admin": false, 00:09:13.836 "nvme_io": false 00:09:13.836 }, 00:09:13.836 "memory_domains": [ 00:09:13.836 { 00:09:13.836 "dma_device_id": "system", 00:09:13.836 "dma_device_type": 1 00:09:13.836 }, 00:09:13.836 { 00:09:13.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:13.836 "dma_device_type": 2 00:09:13.836 }, 00:09:13.836 { 00:09:13.836 "dma_device_id": "system", 00:09:13.836 "dma_device_type": 1 00:09:13.836 }, 00:09:13.836 { 00:09:13.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:13.836 "dma_device_type": 2 00:09:13.836 } 00:09:13.836 ], 00:09:13.836 "driver_specific": { 00:09:13.836 "raid": { 00:09:13.836 "uuid": "697d0950-ad97-4089-b5ef-f07e85c9ed0c", 00:09:13.836 "strip_size_kb": 64, 00:09:13.836 "state": "online", 00:09:13.836 "raid_level": "concat", 00:09:13.836 "superblock": false, 00:09:13.836 "num_base_bdevs": 2, 00:09:13.836 "num_base_bdevs_discovered": 2, 00:09:13.836 "num_base_bdevs_operational": 2, 00:09:13.836 
"base_bdevs_list": [ 00:09:13.836 { 00:09:13.836 "name": "BaseBdev1", 00:09:13.836 "uuid": "dd23b191-ef94-4cf3-b575-44c07552e59e", 00:09:13.836 "is_configured": true, 00:09:13.836 "data_offset": 0, 00:09:13.836 "data_size": 65536 00:09:13.836 }, 00:09:13.836 { 00:09:13.836 "name": "BaseBdev2", 00:09:13.836 "uuid": "bb572962-c061-48d3-ba31-1d1ac2489045", 00:09:13.836 "is_configured": true, 00:09:13.836 "data_offset": 0, 00:09:13.836 "data_size": 65536 00:09:13.836 } 00:09:13.836 ] 00:09:13.836 } 00:09:13.836 } 00:09:13.836 }' 00:09:13.836 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:13.836 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:09:13.836 BaseBdev2' 00:09:13.836 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:13.836 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:13.836 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:14.093 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:14.093 "name": "BaseBdev1", 00:09:14.093 "aliases": [ 00:09:14.093 "dd23b191-ef94-4cf3-b575-44c07552e59e" 00:09:14.093 ], 00:09:14.093 "product_name": "Malloc disk", 00:09:14.093 "block_size": 512, 00:09:14.093 "num_blocks": 65536, 00:09:14.093 "uuid": "dd23b191-ef94-4cf3-b575-44c07552e59e", 00:09:14.093 "assigned_rate_limits": { 00:09:14.093 "rw_ios_per_sec": 0, 00:09:14.093 "rw_mbytes_per_sec": 0, 00:09:14.093 "r_mbytes_per_sec": 0, 00:09:14.093 "w_mbytes_per_sec": 0 00:09:14.093 }, 00:09:14.093 "claimed": true, 00:09:14.093 "claim_type": "exclusive_write", 00:09:14.093 "zoned": false, 00:09:14.093 "supported_io_types": { 00:09:14.093 "read": true, 00:09:14.093 "write": true, 00:09:14.093 "unmap": true, 00:09:14.093 "write_zeroes": true, 00:09:14.093 "flush": true, 00:09:14.093 "reset": true, 00:09:14.093 "compare": false, 00:09:14.093 "compare_and_write": false, 00:09:14.093 "abort": true, 00:09:14.093 "nvme_admin": false, 00:09:14.093 "nvme_io": false 00:09:14.093 }, 00:09:14.094 "memory_domains": [ 00:09:14.094 { 00:09:14.094 "dma_device_id": "system", 00:09:14.094 "dma_device_type": 1 00:09:14.094 }, 00:09:14.094 { 00:09:14.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:14.094 "dma_device_type": 2 00:09:14.094 } 00:09:14.094 ], 00:09:14.094 "driver_specific": {} 00:09:14.094 }' 00:09:14.094 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:14.094 04:10:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:14.094 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:14.094 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:14.094 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:14.094 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:14.094 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:14.352 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:14.352 04:10:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:14.352 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:14.352 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:14.352 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:14.352 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:14.352 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:14.352 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:14.610 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:14.610 "name": "BaseBdev2", 00:09:14.610 "aliases": [ 00:09:14.610 "bb572962-c061-48d3-ba31-1d1ac2489045" 00:09:14.610 ], 00:09:14.610 "product_name": "Malloc disk", 00:09:14.610 "block_size": 512, 00:09:14.610 "num_blocks": 65536, 00:09:14.610 "uuid": "bb572962-c061-48d3-ba31-1d1ac2489045", 00:09:14.610 "assigned_rate_limits": { 00:09:14.610 "rw_ios_per_sec": 0, 00:09:14.610 "rw_mbytes_per_sec": 0, 00:09:14.610 "r_mbytes_per_sec": 0, 00:09:14.610 "w_mbytes_per_sec": 0 00:09:14.610 }, 00:09:14.610 "claimed": true, 00:09:14.610 "claim_type": "exclusive_write", 00:09:14.610 "zoned": false, 00:09:14.610 "supported_io_types": { 00:09:14.610 "read": true, 00:09:14.610 "write": true, 00:09:14.610 "unmap": true, 00:09:14.610 "write_zeroes": true, 00:09:14.610 "flush": true, 00:09:14.610 "reset": true, 00:09:14.610 "compare": false, 00:09:14.610 "compare_and_write": false, 00:09:14.610 "abort": true, 00:09:14.610 "nvme_admin": false, 00:09:14.610 "nvme_io": false 00:09:14.610 }, 00:09:14.610 "memory_domains": [ 00:09:14.610 { 00:09:14.610 "dma_device_id": "system", 00:09:14.610 "dma_device_type": 1 00:09:14.610 }, 00:09:14.610 { 00:09:14.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:14.610 "dma_device_type": 2 00:09:14.610 } 00:09:14.610 ], 00:09:14.610 "driver_specific": {} 00:09:14.610 }' 00:09:14.610 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:14.610 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:14.610 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:14.610 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:14.610 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:14.610 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:14.610 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:14.868 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:14.868 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:14.868 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:14.868 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:14.868 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:14.868 04:10:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:15.126 [2024-05-15 04:10:02.981940] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:15.126 [2024-05-15 04:10:02.981968] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:15.126 [2024-05-15 04:10:02.982013] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:15.126 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:15.384 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:15.384 "name": "Existed_Raid", 00:09:15.384 "uuid": "697d0950-ad97-4089-b5ef-f07e85c9ed0c", 00:09:15.384 "strip_size_kb": 64, 00:09:15.384 "state": "offline", 00:09:15.384 "raid_level": "concat", 00:09:15.384 "superblock": false, 00:09:15.384 "num_base_bdevs": 2, 00:09:15.384 "num_base_bdevs_discovered": 1, 00:09:15.384 "num_base_bdevs_operational": 1, 00:09:15.384 "base_bdevs_list": [ 00:09:15.384 { 00:09:15.384 "name": null, 00:09:15.384 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:15.384 "is_configured": false, 00:09:15.384 "data_offset": 0, 00:09:15.384 "data_size": 65536 00:09:15.384 }, 00:09:15.384 { 00:09:15.384 "name": "BaseBdev2", 00:09:15.384 "uuid": "bb572962-c061-48d3-ba31-1d1ac2489045", 00:09:15.384 "is_configured": true, 00:09:15.384 "data_offset": 0, 00:09:15.384 "data_size": 65536 00:09:15.384 } 00:09:15.384 ] 00:09:15.384 }' 00:09:15.384 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # 
xtrace_disable 00:09:15.384 04:10:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:15.950 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:09:15.950 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:09:15.950 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:15.950 04:10:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:09:16.243 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:09:16.243 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:16.243 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:16.500 [2024-05-15 04:10:04.308012] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:16.501 [2024-05-15 04:10:04.308072] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ce6e0 name Existed_Raid, state offline 00:09:16.501 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:09:16.501 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:09:16.501 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:16.501 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 3828994 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 3828994 ']' 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 3828994 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3828994 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3828994' 00:09:16.759 killing process with pid 3828994 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 3828994 00:09:16.759 [2024-05-15 04:10:04.615267] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:16.759 04:10:04 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@970 -- # wait 3828994 00:09:16.759 [2024-05-15 04:10:04.616468] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:09:17.018 00:09:17.018 real 0m9.659s 00:09:17.018 user 0m17.755s 00:09:17.018 sys 0m1.411s 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:17.018 ************************************ 00:09:17.018 END TEST raid_state_function_test 00:09:17.018 ************************************ 00:09:17.018 04:10:04 bdev_raid -- bdev/bdev_raid.sh@804 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:09:17.018 04:10:04 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:09:17.018 04:10:04 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:17.018 04:10:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:17.018 ************************************ 00:09:17.018 START TEST raid_state_function_test_sb 00:09:17.018 ************************************ 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 2 true 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@232 -- # strip_size=64 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=3830418 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3830418' 00:09:17.018 Process raid pid: 3830418 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 3830418 /var/tmp/spdk-raid.sock 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3830418 ']' 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:17.018 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:17.018 04:10:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:17.018 [2024-05-15 04:10:05.007596] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
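The raid_state_function_test_sb variant that starts here repeats the same sequence; at the RPC level the only difference visible in the trace is the extra -s flag to bdev_raid_create, which requests an on-disk superblock. A minimal sketch of the kind of state check the verify_raid_bdev_state helper performs, assuming the same RPC shorthand as in the earlier sketch (field names are taken from the bdev_raid_get_bdevs output shown in this log; the helper's exact assertions are not reproduced here):

  # create the raid with a superblock this time
  $RPC bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid

  # pull the Existed_Raid entry and compare the fields the test asserts on
  info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  echo "$info" | jq -r .state                      # "configuring" until both base bdevs exist, then "online"
  echo "$info" | jq -r .raid_level                 # "concat"
  echo "$info" | jq -r .num_base_bdevs_discovered  # 2 once BaseBdev1 and BaseBdev2 are claimed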
00:09:17.018 [2024-05-15 04:10:05.007663] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:17.277 [2024-05-15 04:10:05.083418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:17.277 [2024-05-15 04:10:05.188275] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:17.277 [2024-05-15 04:10:05.260347] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:17.277 [2024-05-15 04:10:05.260390] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:17.535 04:10:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:17.535 04:10:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:09:17.535 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:17.794 [2024-05-15 04:10:05.553299] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:17.794 [2024-05-15 04:10:05.553339] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:17.794 [2024-05-15 04:10:05.553360] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:17.794 [2024-05-15 04:10:05.553373] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:17.794 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:18.052 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:18.052 "name": "Existed_Raid", 00:09:18.052 "uuid": "61ee94b8-938b-406f-ad35-657199f74b8e", 00:09:18.052 "strip_size_kb": 64, 00:09:18.052 "state": "configuring", 00:09:18.052 "raid_level": "concat", 00:09:18.052 
"superblock": true, 00:09:18.052 "num_base_bdevs": 2, 00:09:18.052 "num_base_bdevs_discovered": 0, 00:09:18.052 "num_base_bdevs_operational": 2, 00:09:18.052 "base_bdevs_list": [ 00:09:18.052 { 00:09:18.052 "name": "BaseBdev1", 00:09:18.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:18.052 "is_configured": false, 00:09:18.052 "data_offset": 0, 00:09:18.052 "data_size": 0 00:09:18.052 }, 00:09:18.052 { 00:09:18.052 "name": "BaseBdev2", 00:09:18.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:18.052 "is_configured": false, 00:09:18.052 "data_offset": 0, 00:09:18.052 "data_size": 0 00:09:18.052 } 00:09:18.052 ] 00:09:18.052 }' 00:09:18.052 04:10:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:18.052 04:10:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:18.618 04:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:18.618 [2024-05-15 04:10:06.583971] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:18.618 [2024-05-15 04:10:06.584000] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe0b000 name Existed_Raid, state configuring 00:09:18.618 04:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:18.877 [2024-05-15 04:10:06.824610] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:18.877 [2024-05-15 04:10:06.824643] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:18.877 [2024-05-15 04:10:06.824664] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:18.877 [2024-05-15 04:10:06.824677] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:18.877 04:10:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:19.135 [2024-05-15 04:10:07.077335] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:19.135 BaseBdev1 00:09:19.135 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:09:19.135 04:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:09:19.135 04:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:19.135 04:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:09:19.135 04:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:19.135 04:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:19.135 04:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:19.393 04:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:19.651 [ 00:09:19.651 { 00:09:19.651 "name": "BaseBdev1", 00:09:19.651 "aliases": [ 00:09:19.651 "7ec4a34a-9d9c-4f69-941b-590b11750444" 00:09:19.651 ], 00:09:19.651 "product_name": "Malloc disk", 00:09:19.651 "block_size": 512, 00:09:19.651 "num_blocks": 65536, 00:09:19.651 "uuid": "7ec4a34a-9d9c-4f69-941b-590b11750444", 00:09:19.651 "assigned_rate_limits": { 00:09:19.651 "rw_ios_per_sec": 0, 00:09:19.651 "rw_mbytes_per_sec": 0, 00:09:19.651 "r_mbytes_per_sec": 0, 00:09:19.651 "w_mbytes_per_sec": 0 00:09:19.651 }, 00:09:19.651 "claimed": true, 00:09:19.651 "claim_type": "exclusive_write", 00:09:19.651 "zoned": false, 00:09:19.651 "supported_io_types": { 00:09:19.651 "read": true, 00:09:19.651 "write": true, 00:09:19.651 "unmap": true, 00:09:19.651 "write_zeroes": true, 00:09:19.651 "flush": true, 00:09:19.651 "reset": true, 00:09:19.651 "compare": false, 00:09:19.651 "compare_and_write": false, 00:09:19.651 "abort": true, 00:09:19.651 "nvme_admin": false, 00:09:19.651 "nvme_io": false 00:09:19.651 }, 00:09:19.651 "memory_domains": [ 00:09:19.651 { 00:09:19.651 "dma_device_id": "system", 00:09:19.651 "dma_device_type": 1 00:09:19.651 }, 00:09:19.651 { 00:09:19.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:19.651 "dma_device_type": 2 00:09:19.651 } 00:09:19.651 ], 00:09:19.651 "driver_specific": {} 00:09:19.651 } 00:09:19.651 ] 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:19.651 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:19.909 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:19.909 "name": "Existed_Raid", 00:09:19.909 "uuid": "f66b2910-9081-45e7-b523-9018b645fa60", 00:09:19.909 "strip_size_kb": 64, 00:09:19.909 "state": "configuring", 00:09:19.909 "raid_level": "concat", 00:09:19.909 "superblock": true, 00:09:19.909 "num_base_bdevs": 2, 00:09:19.909 "num_base_bdevs_discovered": 1, 00:09:19.909 "num_base_bdevs_operational": 2, 00:09:19.909 
"base_bdevs_list": [ 00:09:19.909 { 00:09:19.909 "name": "BaseBdev1", 00:09:19.909 "uuid": "7ec4a34a-9d9c-4f69-941b-590b11750444", 00:09:19.909 "is_configured": true, 00:09:19.909 "data_offset": 2048, 00:09:19.909 "data_size": 63488 00:09:19.909 }, 00:09:19.909 { 00:09:19.909 "name": "BaseBdev2", 00:09:19.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:19.909 "is_configured": false, 00:09:19.909 "data_offset": 0, 00:09:19.909 "data_size": 0 00:09:19.909 } 00:09:19.909 ] 00:09:19.909 }' 00:09:19.909 04:10:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:19.909 04:10:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:20.474 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:20.732 [2024-05-15 04:10:08.605357] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:20.732 [2024-05-15 04:10:08.605404] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe0a8f0 name Existed_Raid, state configuring 00:09:20.732 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:20.990 [2024-05-15 04:10:08.866106] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:20.990 [2024-05-15 04:10:08.867481] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:20.990 [2024-05-15 04:10:08.867514] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:20.990 04:10:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:09:21.248 04:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:21.248 "name": "Existed_Raid", 00:09:21.248 "uuid": "050accc6-487e-429f-b28c-63ff45b583b3", 00:09:21.248 "strip_size_kb": 64, 00:09:21.248 "state": "configuring", 00:09:21.248 "raid_level": "concat", 00:09:21.248 "superblock": true, 00:09:21.248 "num_base_bdevs": 2, 00:09:21.248 "num_base_bdevs_discovered": 1, 00:09:21.248 "num_base_bdevs_operational": 2, 00:09:21.248 "base_bdevs_list": [ 00:09:21.248 { 00:09:21.248 "name": "BaseBdev1", 00:09:21.248 "uuid": "7ec4a34a-9d9c-4f69-941b-590b11750444", 00:09:21.248 "is_configured": true, 00:09:21.248 "data_offset": 2048, 00:09:21.248 "data_size": 63488 00:09:21.248 }, 00:09:21.248 { 00:09:21.248 "name": "BaseBdev2", 00:09:21.248 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:21.248 "is_configured": false, 00:09:21.248 "data_offset": 0, 00:09:21.248 "data_size": 0 00:09:21.248 } 00:09:21.248 ] 00:09:21.248 }' 00:09:21.248 04:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:21.248 04:10:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:21.812 04:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:22.070 [2024-05-15 04:10:09.902845] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:22.070 [2024-05-15 04:10:09.903068] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xe0b6e0 00:09:22.070 [2024-05-15 04:10:09.903086] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:22.070 [2024-05-15 04:10:09.903277] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe0ca30 00:09:22.070 [2024-05-15 04:10:09.903423] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe0b6e0 00:09:22.070 [2024-05-15 04:10:09.903439] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe0b6e0 00:09:22.070 [2024-05-15 04:10:09.903561] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:22.070 BaseBdev2 00:09:22.070 04:10:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:09:22.070 04:10:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:09:22.070 04:10:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:22.070 04:10:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:09:22.070 04:10:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:22.070 04:10:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:22.070 04:10:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:22.328 04:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:22.586 [ 00:09:22.586 { 00:09:22.586 "name": "BaseBdev2", 00:09:22.586 
"aliases": [ 00:09:22.586 "58ff046b-5b9c-444a-9aee-0ee3175200de" 00:09:22.586 ], 00:09:22.586 "product_name": "Malloc disk", 00:09:22.586 "block_size": 512, 00:09:22.586 "num_blocks": 65536, 00:09:22.586 "uuid": "58ff046b-5b9c-444a-9aee-0ee3175200de", 00:09:22.586 "assigned_rate_limits": { 00:09:22.586 "rw_ios_per_sec": 0, 00:09:22.586 "rw_mbytes_per_sec": 0, 00:09:22.586 "r_mbytes_per_sec": 0, 00:09:22.586 "w_mbytes_per_sec": 0 00:09:22.586 }, 00:09:22.586 "claimed": true, 00:09:22.586 "claim_type": "exclusive_write", 00:09:22.586 "zoned": false, 00:09:22.586 "supported_io_types": { 00:09:22.586 "read": true, 00:09:22.586 "write": true, 00:09:22.586 "unmap": true, 00:09:22.586 "write_zeroes": true, 00:09:22.586 "flush": true, 00:09:22.586 "reset": true, 00:09:22.586 "compare": false, 00:09:22.586 "compare_and_write": false, 00:09:22.586 "abort": true, 00:09:22.586 "nvme_admin": false, 00:09:22.586 "nvme_io": false 00:09:22.586 }, 00:09:22.586 "memory_domains": [ 00:09:22.586 { 00:09:22.586 "dma_device_id": "system", 00:09:22.586 "dma_device_type": 1 00:09:22.586 }, 00:09:22.586 { 00:09:22.586 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:22.586 "dma_device_type": 2 00:09:22.586 } 00:09:22.586 ], 00:09:22.586 "driver_specific": {} 00:09:22.586 } 00:09:22.586 ] 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:22.586 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:22.845 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:22.845 "name": "Existed_Raid", 00:09:22.845 "uuid": "050accc6-487e-429f-b28c-63ff45b583b3", 00:09:22.845 "strip_size_kb": 64, 00:09:22.845 "state": "online", 00:09:22.845 "raid_level": "concat", 00:09:22.845 "superblock": true, 00:09:22.845 "num_base_bdevs": 2, 00:09:22.845 "num_base_bdevs_discovered": 2, 
00:09:22.845 "num_base_bdevs_operational": 2, 00:09:22.845 "base_bdevs_list": [ 00:09:22.845 { 00:09:22.845 "name": "BaseBdev1", 00:09:22.845 "uuid": "7ec4a34a-9d9c-4f69-941b-590b11750444", 00:09:22.845 "is_configured": true, 00:09:22.845 "data_offset": 2048, 00:09:22.845 "data_size": 63488 00:09:22.845 }, 00:09:22.845 { 00:09:22.845 "name": "BaseBdev2", 00:09:22.845 "uuid": "58ff046b-5b9c-444a-9aee-0ee3175200de", 00:09:22.845 "is_configured": true, 00:09:22.845 "data_offset": 2048, 00:09:22.845 "data_size": 63488 00:09:22.845 } 00:09:22.845 ] 00:09:22.845 }' 00:09:22.845 04:10:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:22.845 04:10:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:23.409 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:09:23.409 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:09:23.409 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:09:23.409 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:09:23.409 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:09:23.409 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:09:23.409 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:23.409 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:09:23.675 [2024-05-15 04:10:11.447142] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:23.675 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:09:23.675 "name": "Existed_Raid", 00:09:23.675 "aliases": [ 00:09:23.675 "050accc6-487e-429f-b28c-63ff45b583b3" 00:09:23.675 ], 00:09:23.675 "product_name": "Raid Volume", 00:09:23.675 "block_size": 512, 00:09:23.675 "num_blocks": 126976, 00:09:23.675 "uuid": "050accc6-487e-429f-b28c-63ff45b583b3", 00:09:23.675 "assigned_rate_limits": { 00:09:23.675 "rw_ios_per_sec": 0, 00:09:23.675 "rw_mbytes_per_sec": 0, 00:09:23.675 "r_mbytes_per_sec": 0, 00:09:23.675 "w_mbytes_per_sec": 0 00:09:23.675 }, 00:09:23.675 "claimed": false, 00:09:23.675 "zoned": false, 00:09:23.675 "supported_io_types": { 00:09:23.675 "read": true, 00:09:23.675 "write": true, 00:09:23.675 "unmap": true, 00:09:23.675 "write_zeroes": true, 00:09:23.675 "flush": true, 00:09:23.675 "reset": true, 00:09:23.675 "compare": false, 00:09:23.675 "compare_and_write": false, 00:09:23.675 "abort": false, 00:09:23.675 "nvme_admin": false, 00:09:23.675 "nvme_io": false 00:09:23.675 }, 00:09:23.675 "memory_domains": [ 00:09:23.675 { 00:09:23.675 "dma_device_id": "system", 00:09:23.675 "dma_device_type": 1 00:09:23.675 }, 00:09:23.675 { 00:09:23.675 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:23.675 "dma_device_type": 2 00:09:23.675 }, 00:09:23.675 { 00:09:23.675 "dma_device_id": "system", 00:09:23.675 "dma_device_type": 1 00:09:23.675 }, 00:09:23.675 { 00:09:23.675 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:23.675 "dma_device_type": 2 00:09:23.675 } 00:09:23.675 ], 00:09:23.675 "driver_specific": { 00:09:23.675 "raid": { 00:09:23.675 "uuid": 
"050accc6-487e-429f-b28c-63ff45b583b3", 00:09:23.675 "strip_size_kb": 64, 00:09:23.675 "state": "online", 00:09:23.675 "raid_level": "concat", 00:09:23.675 "superblock": true, 00:09:23.675 "num_base_bdevs": 2, 00:09:23.675 "num_base_bdevs_discovered": 2, 00:09:23.675 "num_base_bdevs_operational": 2, 00:09:23.675 "base_bdevs_list": [ 00:09:23.675 { 00:09:23.675 "name": "BaseBdev1", 00:09:23.675 "uuid": "7ec4a34a-9d9c-4f69-941b-590b11750444", 00:09:23.675 "is_configured": true, 00:09:23.675 "data_offset": 2048, 00:09:23.675 "data_size": 63488 00:09:23.675 }, 00:09:23.675 { 00:09:23.675 "name": "BaseBdev2", 00:09:23.675 "uuid": "58ff046b-5b9c-444a-9aee-0ee3175200de", 00:09:23.675 "is_configured": true, 00:09:23.675 "data_offset": 2048, 00:09:23.675 "data_size": 63488 00:09:23.675 } 00:09:23.675 ] 00:09:23.675 } 00:09:23.675 } 00:09:23.675 }' 00:09:23.675 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:23.676 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:09:23.676 BaseBdev2' 00:09:23.676 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:23.676 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:23.676 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:23.938 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:23.938 "name": "BaseBdev1", 00:09:23.938 "aliases": [ 00:09:23.938 "7ec4a34a-9d9c-4f69-941b-590b11750444" 00:09:23.938 ], 00:09:23.938 "product_name": "Malloc disk", 00:09:23.938 "block_size": 512, 00:09:23.938 "num_blocks": 65536, 00:09:23.938 "uuid": "7ec4a34a-9d9c-4f69-941b-590b11750444", 00:09:23.938 "assigned_rate_limits": { 00:09:23.938 "rw_ios_per_sec": 0, 00:09:23.938 "rw_mbytes_per_sec": 0, 00:09:23.938 "r_mbytes_per_sec": 0, 00:09:23.938 "w_mbytes_per_sec": 0 00:09:23.938 }, 00:09:23.938 "claimed": true, 00:09:23.938 "claim_type": "exclusive_write", 00:09:23.938 "zoned": false, 00:09:23.938 "supported_io_types": { 00:09:23.938 "read": true, 00:09:23.938 "write": true, 00:09:23.938 "unmap": true, 00:09:23.938 "write_zeroes": true, 00:09:23.938 "flush": true, 00:09:23.938 "reset": true, 00:09:23.938 "compare": false, 00:09:23.938 "compare_and_write": false, 00:09:23.938 "abort": true, 00:09:23.938 "nvme_admin": false, 00:09:23.938 "nvme_io": false 00:09:23.938 }, 00:09:23.938 "memory_domains": [ 00:09:23.938 { 00:09:23.938 "dma_device_id": "system", 00:09:23.938 "dma_device_type": 1 00:09:23.938 }, 00:09:23.938 { 00:09:23.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:23.938 "dma_device_type": 2 00:09:23.938 } 00:09:23.938 ], 00:09:23.938 "driver_specific": {} 00:09:23.938 }' 00:09:23.938 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:23.938 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:23.938 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:23.938 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:23.938 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:23.938 
04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:23.938 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:23.938 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:24.195 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:24.195 04:10:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:24.195 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:24.195 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:24.196 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:24.196 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:24.196 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:24.453 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:24.453 "name": "BaseBdev2", 00:09:24.453 "aliases": [ 00:09:24.453 "58ff046b-5b9c-444a-9aee-0ee3175200de" 00:09:24.453 ], 00:09:24.453 "product_name": "Malloc disk", 00:09:24.453 "block_size": 512, 00:09:24.453 "num_blocks": 65536, 00:09:24.453 "uuid": "58ff046b-5b9c-444a-9aee-0ee3175200de", 00:09:24.453 "assigned_rate_limits": { 00:09:24.453 "rw_ios_per_sec": 0, 00:09:24.453 "rw_mbytes_per_sec": 0, 00:09:24.453 "r_mbytes_per_sec": 0, 00:09:24.453 "w_mbytes_per_sec": 0 00:09:24.453 }, 00:09:24.453 "claimed": true, 00:09:24.453 "claim_type": "exclusive_write", 00:09:24.453 "zoned": false, 00:09:24.453 "supported_io_types": { 00:09:24.453 "read": true, 00:09:24.453 "write": true, 00:09:24.453 "unmap": true, 00:09:24.453 "write_zeroes": true, 00:09:24.453 "flush": true, 00:09:24.453 "reset": true, 00:09:24.453 "compare": false, 00:09:24.453 "compare_and_write": false, 00:09:24.453 "abort": true, 00:09:24.453 "nvme_admin": false, 00:09:24.453 "nvme_io": false 00:09:24.453 }, 00:09:24.453 "memory_domains": [ 00:09:24.453 { 00:09:24.453 "dma_device_id": "system", 00:09:24.453 "dma_device_type": 1 00:09:24.453 }, 00:09:24.453 { 00:09:24.453 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:24.453 "dma_device_type": 2 00:09:24.453 } 00:09:24.453 ], 00:09:24.453 "driver_specific": {} 00:09:24.453 }' 00:09:24.453 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:24.453 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:24.453 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:24.453 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:24.453 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:24.453 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:24.453 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:24.453 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:24.711 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:24.711 
04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:24.711 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:24.711 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:24.711 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:24.970 [2024-05-15 04:10:12.790563] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:24.970 [2024-05-15 04:10:12.790586] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:24.970 [2024-05-15 04:10:12.790628] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:24.970 04:10:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:25.228 04:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:25.228 "name": "Existed_Raid", 00:09:25.228 "uuid": "050accc6-487e-429f-b28c-63ff45b583b3", 00:09:25.228 "strip_size_kb": 64, 00:09:25.228 "state": "offline", 00:09:25.228 "raid_level": "concat", 00:09:25.228 "superblock": true, 00:09:25.228 "num_base_bdevs": 2, 00:09:25.228 "num_base_bdevs_discovered": 1, 00:09:25.228 "num_base_bdevs_operational": 1, 00:09:25.228 "base_bdevs_list": [ 00:09:25.228 { 00:09:25.228 "name": null, 00:09:25.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:25.228 
"is_configured": false, 00:09:25.228 "data_offset": 2048, 00:09:25.228 "data_size": 63488 00:09:25.228 }, 00:09:25.228 { 00:09:25.228 "name": "BaseBdev2", 00:09:25.228 "uuid": "58ff046b-5b9c-444a-9aee-0ee3175200de", 00:09:25.228 "is_configured": true, 00:09:25.228 "data_offset": 2048, 00:09:25.228 "data_size": 63488 00:09:25.228 } 00:09:25.228 ] 00:09:25.228 }' 00:09:25.228 04:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:25.228 04:10:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:25.793 04:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:09:25.793 04:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:09:25.793 04:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:25.793 04:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:09:26.051 04:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:09:26.051 04:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:26.051 04:10:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:26.309 [2024-05-15 04:10:14.111634] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:26.309 [2024-05-15 04:10:14.111694] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe0b6e0 name Existed_Raid, state offline 00:09:26.309 04:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:09:26.309 04:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:09:26.309 04:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:26.309 04:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 3830418 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3830418 ']' 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 3830418 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3830418 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3830418' 00:09:26.567 killing process with pid 3830418 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 3830418 00:09:26.567 [2024-05-15 04:10:14.403181] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:26.567 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 3830418 00:09:26.567 [2024-05-15 04:10:14.404336] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:26.826 04:10:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:09:26.826 00:09:26.826 real 0m9.712s 00:09:26.826 user 0m17.890s 00:09:26.826 sys 0m1.405s 00:09:26.826 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:26.826 04:10:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:26.826 ************************************ 00:09:26.826 END TEST raid_state_function_test_sb 00:09:26.826 ************************************ 00:09:26.826 04:10:14 bdev_raid -- bdev/bdev_raid.sh@805 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:09:26.826 04:10:14 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:09:26.826 04:10:14 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:26.826 04:10:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:26.826 ************************************ 00:09:26.826 START TEST raid_superblock_test 00:09:26.826 ************************************ 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test concat 2 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=concat 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' concat '!=' raid1 ']' 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # 
strip_size_create_arg='-z 64' 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=3831840 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 3831840 /var/tmp/spdk-raid.sock 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 3831840 ']' 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:26.826 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:26.826 04:10:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:26.826 [2024-05-15 04:10:14.768400] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:09:26.826 [2024-05-15 04:10:14.768473] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3831840 ] 00:09:27.084 [2024-05-15 04:10:14.845283] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:27.084 [2024-05-15 04:10:14.954931] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:27.084 [2024-05-15 04:10:15.026234] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:27.084 [2024-05-15 04:10:15.026280] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:09:28.017 malloc1 00:09:28.017 04:10:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:28.275 [2024-05-15 04:10:16.198829] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:28.275 [2024-05-15 04:10:16.198887] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:28.275 [2024-05-15 04:10:16.198918] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x220ec20 00:09:28.275 [2024-05-15 04:10:16.198932] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:28.275 [2024-05-15 04:10:16.200810] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:28.275 [2024-05-15 04:10:16.200850] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:28.275 pt1 00:09:28.275 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:09:28.275 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:09:28.275 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:09:28.275 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:09:28.275 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:09:28.275 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:28.275 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:09:28.275 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:28.275 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:09:28.534 malloc2 00:09:28.534 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:28.792 [2024-05-15 04:10:16.691892] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:28.792 [2024-05-15 04:10:16.691946] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:28.792 [2024-05-15 04:10:16.691969] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2206c00 00:09:28.792 [2024-05-15 04:10:16.691981] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:28.792 [2024-05-15 04:10:16.693684] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:28.792 [2024-05-15 04:10:16.693708] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:28.792 pt2 00:09:28.792 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:09:28.792 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:09:28.792 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:09:29.050 [2024-05-15 04:10:16.940576] bdev_raid.c:3138:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev pt1 is claimed 00:09:29.050 [2024-05-15 04:10:16.941867] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:29.050 [2024-05-15 04:10:16.942029] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x2207230 00:09:29.050 [2024-05-15 04:10:16.942044] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:29.050 [2024-05-15 04:10:16.942263] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2225b10 00:09:29.050 [2024-05-15 04:10:16.942417] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2207230 00:09:29.050 [2024-05-15 04:10:16.942430] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2207230 00:09:29.050 [2024-05-15 04:10:16.942556] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:29.050 04:10:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:29.307 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:29.307 "name": "raid_bdev1", 00:09:29.307 "uuid": "752cdd25-e703-4d71-bc07-ec20cf8ab467", 00:09:29.307 "strip_size_kb": 64, 00:09:29.307 "state": "online", 00:09:29.307 "raid_level": "concat", 00:09:29.307 "superblock": true, 00:09:29.307 "num_base_bdevs": 2, 00:09:29.307 "num_base_bdevs_discovered": 2, 00:09:29.307 "num_base_bdevs_operational": 2, 00:09:29.307 "base_bdevs_list": [ 00:09:29.307 { 00:09:29.307 "name": "pt1", 00:09:29.307 "uuid": "9665e4ef-3496-5f55-8b3a-d464a2be8ebb", 00:09:29.307 "is_configured": true, 00:09:29.307 "data_offset": 2048, 00:09:29.307 "data_size": 63488 00:09:29.307 }, 00:09:29.307 { 00:09:29.307 "name": "pt2", 00:09:29.307 "uuid": "fa88afbc-b391-5f4f-ba75-aacccbfae2de", 00:09:29.307 "is_configured": true, 00:09:29.307 "data_offset": 2048, 00:09:29.307 "data_size": 63488 00:09:29.307 } 00:09:29.307 ] 00:09:29.307 }' 00:09:29.308 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:29.308 04:10:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:29.872 04:10:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:09:29.872 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:09:29.872 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:09:29.872 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:09:29.872 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:09:29.872 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:09:29.872 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:29.873 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:09:30.131 [2024-05-15 04:10:17.979507] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:30.131 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:09:30.131 "name": "raid_bdev1", 00:09:30.131 "aliases": [ 00:09:30.131 "752cdd25-e703-4d71-bc07-ec20cf8ab467" 00:09:30.131 ], 00:09:30.131 "product_name": "Raid Volume", 00:09:30.131 "block_size": 512, 00:09:30.131 "num_blocks": 126976, 00:09:30.131 "uuid": "752cdd25-e703-4d71-bc07-ec20cf8ab467", 00:09:30.131 "assigned_rate_limits": { 00:09:30.131 "rw_ios_per_sec": 0, 00:09:30.131 "rw_mbytes_per_sec": 0, 00:09:30.131 "r_mbytes_per_sec": 0, 00:09:30.131 "w_mbytes_per_sec": 0 00:09:30.131 }, 00:09:30.131 "claimed": false, 00:09:30.131 "zoned": false, 00:09:30.131 "supported_io_types": { 00:09:30.131 "read": true, 00:09:30.131 "write": true, 00:09:30.131 "unmap": true, 00:09:30.131 "write_zeroes": true, 00:09:30.131 "flush": true, 00:09:30.131 "reset": true, 00:09:30.131 "compare": false, 00:09:30.131 "compare_and_write": false, 00:09:30.131 "abort": false, 00:09:30.131 "nvme_admin": false, 00:09:30.131 "nvme_io": false 00:09:30.131 }, 00:09:30.131 "memory_domains": [ 00:09:30.131 { 00:09:30.131 "dma_device_id": "system", 00:09:30.131 "dma_device_type": 1 00:09:30.131 }, 00:09:30.131 { 00:09:30.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:30.131 "dma_device_type": 2 00:09:30.131 }, 00:09:30.131 { 00:09:30.131 "dma_device_id": "system", 00:09:30.131 "dma_device_type": 1 00:09:30.131 }, 00:09:30.131 { 00:09:30.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:30.131 "dma_device_type": 2 00:09:30.131 } 00:09:30.131 ], 00:09:30.131 "driver_specific": { 00:09:30.131 "raid": { 00:09:30.131 "uuid": "752cdd25-e703-4d71-bc07-ec20cf8ab467", 00:09:30.131 "strip_size_kb": 64, 00:09:30.131 "state": "online", 00:09:30.131 "raid_level": "concat", 00:09:30.131 "superblock": true, 00:09:30.131 "num_base_bdevs": 2, 00:09:30.131 "num_base_bdevs_discovered": 2, 00:09:30.131 "num_base_bdevs_operational": 2, 00:09:30.131 "base_bdevs_list": [ 00:09:30.131 { 00:09:30.131 "name": "pt1", 00:09:30.131 "uuid": "9665e4ef-3496-5f55-8b3a-d464a2be8ebb", 00:09:30.131 "is_configured": true, 00:09:30.131 "data_offset": 2048, 00:09:30.131 "data_size": 63488 00:09:30.131 }, 00:09:30.131 { 00:09:30.131 "name": "pt2", 00:09:30.131 "uuid": "fa88afbc-b391-5f4f-ba75-aacccbfae2de", 00:09:30.131 "is_configured": true, 00:09:30.131 "data_offset": 2048, 00:09:30.131 "data_size": 63488 00:09:30.131 } 00:09:30.131 ] 00:09:30.131 } 00:09:30.131 } 00:09:30.131 }' 00:09:30.131 04:10:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq 
-r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:30.131 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:09:30.131 pt2' 00:09:30.131 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:30.131 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:30.131 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:30.389 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:30.389 "name": "pt1", 00:09:30.389 "aliases": [ 00:09:30.389 "9665e4ef-3496-5f55-8b3a-d464a2be8ebb" 00:09:30.389 ], 00:09:30.389 "product_name": "passthru", 00:09:30.389 "block_size": 512, 00:09:30.389 "num_blocks": 65536, 00:09:30.389 "uuid": "9665e4ef-3496-5f55-8b3a-d464a2be8ebb", 00:09:30.389 "assigned_rate_limits": { 00:09:30.389 "rw_ios_per_sec": 0, 00:09:30.389 "rw_mbytes_per_sec": 0, 00:09:30.389 "r_mbytes_per_sec": 0, 00:09:30.389 "w_mbytes_per_sec": 0 00:09:30.389 }, 00:09:30.389 "claimed": true, 00:09:30.389 "claim_type": "exclusive_write", 00:09:30.389 "zoned": false, 00:09:30.389 "supported_io_types": { 00:09:30.389 "read": true, 00:09:30.389 "write": true, 00:09:30.389 "unmap": true, 00:09:30.389 "write_zeroes": true, 00:09:30.389 "flush": true, 00:09:30.389 "reset": true, 00:09:30.389 "compare": false, 00:09:30.389 "compare_and_write": false, 00:09:30.389 "abort": true, 00:09:30.389 "nvme_admin": false, 00:09:30.389 "nvme_io": false 00:09:30.389 }, 00:09:30.389 "memory_domains": [ 00:09:30.389 { 00:09:30.389 "dma_device_id": "system", 00:09:30.389 "dma_device_type": 1 00:09:30.389 }, 00:09:30.389 { 00:09:30.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:30.389 "dma_device_type": 2 00:09:30.389 } 00:09:30.389 ], 00:09:30.389 "driver_specific": { 00:09:30.389 "passthru": { 00:09:30.389 "name": "pt1", 00:09:30.389 "base_bdev_name": "malloc1" 00:09:30.389 } 00:09:30.389 } 00:09:30.389 }' 00:09:30.389 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:30.389 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:30.389 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:30.389 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:30.389 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:30.646 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:30.646 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:30.646 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:30.646 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:30.646 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:30.646 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:30.646 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:30.646 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:30.646 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:30.646 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:30.904 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:30.904 "name": "pt2", 00:09:30.904 "aliases": [ 00:09:30.904 "fa88afbc-b391-5f4f-ba75-aacccbfae2de" 00:09:30.904 ], 00:09:30.904 "product_name": "passthru", 00:09:30.904 "block_size": 512, 00:09:30.904 "num_blocks": 65536, 00:09:30.904 "uuid": "fa88afbc-b391-5f4f-ba75-aacccbfae2de", 00:09:30.904 "assigned_rate_limits": { 00:09:30.904 "rw_ios_per_sec": 0, 00:09:30.904 "rw_mbytes_per_sec": 0, 00:09:30.904 "r_mbytes_per_sec": 0, 00:09:30.904 "w_mbytes_per_sec": 0 00:09:30.904 }, 00:09:30.904 "claimed": true, 00:09:30.904 "claim_type": "exclusive_write", 00:09:30.904 "zoned": false, 00:09:30.904 "supported_io_types": { 00:09:30.904 "read": true, 00:09:30.904 "write": true, 00:09:30.904 "unmap": true, 00:09:30.904 "write_zeroes": true, 00:09:30.904 "flush": true, 00:09:30.904 "reset": true, 00:09:30.904 "compare": false, 00:09:30.904 "compare_and_write": false, 00:09:30.904 "abort": true, 00:09:30.904 "nvme_admin": false, 00:09:30.904 "nvme_io": false 00:09:30.904 }, 00:09:30.904 "memory_domains": [ 00:09:30.904 { 00:09:30.904 "dma_device_id": "system", 00:09:30.904 "dma_device_type": 1 00:09:30.904 }, 00:09:30.904 { 00:09:30.904 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:30.905 "dma_device_type": 2 00:09:30.905 } 00:09:30.905 ], 00:09:30.905 "driver_specific": { 00:09:30.905 "passthru": { 00:09:30.905 "name": "pt2", 00:09:30.905 "base_bdev_name": "malloc2" 00:09:30.905 } 00:09:30.905 } 00:09:30.905 }' 00:09:30.905 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:30.905 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:30.905 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:30.905 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:30.905 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:31.163 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:31.163 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:31.163 04:10:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:31.163 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:31.163 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:31.163 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:31.163 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:31.163 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:31.163 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:09:31.422 [2024-05-15 04:10:19.335089] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:31.422 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=752cdd25-e703-4d71-bc07-ec20cf8ab467 00:09:31.422 04:10:19 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@436 -- # '[' -z 752cdd25-e703-4d71-bc07-ec20cf8ab467 ']' 00:09:31.422 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:09:31.680 [2024-05-15 04:10:19.567483] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:31.680 [2024-05-15 04:10:19.567506] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:31.680 [2024-05-15 04:10:19.567579] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:31.680 [2024-05-15 04:10:19.567635] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:31.680 [2024-05-15 04:10:19.567650] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2207230 name raid_bdev1, state offline 00:09:31.680 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:31.680 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:09:31.938 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:09:31.938 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:09:31.938 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:09:31.938 04:10:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:09:32.196 04:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:09:32.196 04:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:09:32.454 04:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:09:32.454 04:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:32.713 
04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:32.713 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:09:32.971 [2024-05-15 04:10:20.818839] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:09:32.971 [2024-05-15 04:10:20.820258] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:09:32.971 [2024-05-15 04:10:20.820326] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:09:32.971 [2024-05-15 04:10:20.820392] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:09:32.971 [2024-05-15 04:10:20.820418] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:09:32.971 [2024-05-15 04:10:20.820431] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x220f800 name raid_bdev1, state configuring 00:09:32.971 request: 00:09:32.971 { 00:09:32.971 "name": "raid_bdev1", 00:09:32.971 "raid_level": "concat", 00:09:32.971 "base_bdevs": [ 00:09:32.971 "malloc1", 00:09:32.971 "malloc2" 00:09:32.971 ], 00:09:32.971 "superblock": false, 00:09:32.971 "strip_size_kb": 64, 00:09:32.971 "method": "bdev_raid_create", 00:09:32.971 "req_id": 1 00:09:32.971 } 00:09:32.971 Got JSON-RPC error response 00:09:32.971 response: 00:09:32.971 { 00:09:32.971 "code": -17, 00:09:32.971 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:09:32.971 } 00:09:32.971 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:09:32.971 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:32.971 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:32.971 04:10:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:32.971 04:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:32.971 04:10:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:09:33.228 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:09:33.228 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:09:33.228 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:33.512 [2024-05-15 04:10:21.304075] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:33.512 [2024-05-15 04:10:21.304138] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:33.512 [2024-05-15 04:10:21.304187] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x220f0e0 00:09:33.512 [2024-05-15 04:10:21.304201] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:33.512 [2024-05-15 04:10:21.305800] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:33.512 [2024-05-15 04:10:21.305839] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:33.512 [2024-05-15 04:10:21.305943] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:09:33.512 [2024-05-15 04:10:21.305982] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:33.512 pt1 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:33.512 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:33.770 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:33.770 "name": "raid_bdev1", 00:09:33.770 "uuid": "752cdd25-e703-4d71-bc07-ec20cf8ab467", 00:09:33.770 "strip_size_kb": 64, 00:09:33.770 "state": "configuring", 00:09:33.770 "raid_level": "concat", 00:09:33.770 "superblock": true, 00:09:33.770 "num_base_bdevs": 2, 00:09:33.770 "num_base_bdevs_discovered": 1, 00:09:33.770 "num_base_bdevs_operational": 2, 00:09:33.770 "base_bdevs_list": [ 00:09:33.770 { 00:09:33.770 "name": "pt1", 00:09:33.770 "uuid": "9665e4ef-3496-5f55-8b3a-d464a2be8ebb", 00:09:33.770 "is_configured": true, 00:09:33.770 "data_offset": 2048, 00:09:33.770 "data_size": 63488 00:09:33.770 }, 00:09:33.770 { 00:09:33.770 "name": null, 00:09:33.770 "uuid": "fa88afbc-b391-5f4f-ba75-aacccbfae2de", 00:09:33.770 "is_configured": false, 00:09:33.770 "data_offset": 2048, 00:09:33.770 "data_size": 63488 00:09:33.770 } 00:09:33.770 ] 00:09:33.770 }' 00:09:33.770 04:10:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:33.770 04:10:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 
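Annotation: a minimal shell sketch of the superblock re-assembly step the trace above just performed, assuming the rpc.py path and -s socket shown in the log; the $rpc shorthand, the trailing '|| true' guard and the comments are illustrative, the commands themselves are the ones that appear in the trace.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Re-creating the raid directly on the malloc bdevs is expected to fail with
# JSON-RPC error -17 (File exists): both bdevs still carry the superblock
# written by the original raid_bdev1.
$rpc bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 || true
# Wrapping malloc1 in a passthru bdev triggers the examine path, which finds the
# raid superblock and re-registers raid_bdev1 in the "configuring" state with
# one of its two base bdevs discovered.
$rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'   # "state": "configuring" at this point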
00:09:34.336 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:09:34.336 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:09:34.336 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:09:34.336 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:34.336 [2024-05-15 04:10:22.342885] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:34.336 [2024-05-15 04:10:22.342945] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:34.336 [2024-05-15 04:10:22.342974] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2208650 00:09:34.336 [2024-05-15 04:10:22.342990] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:34.336 [2024-05-15 04:10:22.343404] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:34.336 [2024-05-15 04:10:22.343430] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:34.336 [2024-05-15 04:10:22.343515] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:09:34.336 [2024-05-15 04:10:22.343563] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:34.336 [2024-05-15 04:10:22.343697] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x220c840 00:09:34.336 [2024-05-15 04:10:22.343714] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:34.336 [2024-05-15 04:10:22.343899] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2205f60 00:09:34.336 [2024-05-15 04:10:22.344047] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x220c840 00:09:34.336 [2024-05-15 04:10:22.344063] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x220c840 00:09:34.336 [2024-05-15 04:10:22.344175] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:34.336 pt2 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:34.594 04:10:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:34.594 "name": "raid_bdev1", 00:09:34.594 "uuid": "752cdd25-e703-4d71-bc07-ec20cf8ab467", 00:09:34.594 "strip_size_kb": 64, 00:09:34.594 "state": "online", 00:09:34.594 "raid_level": "concat", 00:09:34.594 "superblock": true, 00:09:34.594 "num_base_bdevs": 2, 00:09:34.594 "num_base_bdevs_discovered": 2, 00:09:34.594 "num_base_bdevs_operational": 2, 00:09:34.594 "base_bdevs_list": [ 00:09:34.594 { 00:09:34.594 "name": "pt1", 00:09:34.594 "uuid": "9665e4ef-3496-5f55-8b3a-d464a2be8ebb", 00:09:34.594 "is_configured": true, 00:09:34.594 "data_offset": 2048, 00:09:34.594 "data_size": 63488 00:09:34.594 }, 00:09:34.594 { 00:09:34.594 "name": "pt2", 00:09:34.594 "uuid": "fa88afbc-b391-5f4f-ba75-aacccbfae2de", 00:09:34.594 "is_configured": true, 00:09:34.594 "data_offset": 2048, 00:09:34.594 "data_size": 63488 00:09:34.594 } 00:09:34.594 ] 00:09:34.594 }' 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:34.594 04:10:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:35.160 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:09:35.160 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:09:35.160 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:09:35.160 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:09:35.160 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:09:35.160 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:09:35.160 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:35.160 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:09:35.418 [2024-05-15 04:10:23.349753] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:35.418 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:09:35.418 "name": "raid_bdev1", 00:09:35.418 "aliases": [ 00:09:35.418 "752cdd25-e703-4d71-bc07-ec20cf8ab467" 00:09:35.418 ], 00:09:35.418 "product_name": "Raid Volume", 00:09:35.418 "block_size": 512, 00:09:35.418 "num_blocks": 126976, 00:09:35.418 "uuid": "752cdd25-e703-4d71-bc07-ec20cf8ab467", 00:09:35.418 "assigned_rate_limits": { 00:09:35.418 "rw_ios_per_sec": 0, 00:09:35.418 "rw_mbytes_per_sec": 0, 00:09:35.418 "r_mbytes_per_sec": 0, 00:09:35.418 "w_mbytes_per_sec": 0 00:09:35.418 }, 00:09:35.418 "claimed": false, 00:09:35.418 "zoned": false, 00:09:35.418 "supported_io_types": { 00:09:35.418 "read": true, 00:09:35.418 "write": true, 00:09:35.418 "unmap": true, 00:09:35.418 "write_zeroes": true, 00:09:35.418 "flush": true, 00:09:35.418 "reset": true, 00:09:35.418 "compare": false, 00:09:35.418 "compare_and_write": 
false, 00:09:35.418 "abort": false, 00:09:35.418 "nvme_admin": false, 00:09:35.418 "nvme_io": false 00:09:35.418 }, 00:09:35.418 "memory_domains": [ 00:09:35.418 { 00:09:35.418 "dma_device_id": "system", 00:09:35.418 "dma_device_type": 1 00:09:35.418 }, 00:09:35.418 { 00:09:35.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:35.418 "dma_device_type": 2 00:09:35.418 }, 00:09:35.418 { 00:09:35.418 "dma_device_id": "system", 00:09:35.418 "dma_device_type": 1 00:09:35.418 }, 00:09:35.418 { 00:09:35.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:35.418 "dma_device_type": 2 00:09:35.418 } 00:09:35.418 ], 00:09:35.418 "driver_specific": { 00:09:35.418 "raid": { 00:09:35.418 "uuid": "752cdd25-e703-4d71-bc07-ec20cf8ab467", 00:09:35.418 "strip_size_kb": 64, 00:09:35.418 "state": "online", 00:09:35.418 "raid_level": "concat", 00:09:35.418 "superblock": true, 00:09:35.418 "num_base_bdevs": 2, 00:09:35.418 "num_base_bdevs_discovered": 2, 00:09:35.418 "num_base_bdevs_operational": 2, 00:09:35.418 "base_bdevs_list": [ 00:09:35.418 { 00:09:35.418 "name": "pt1", 00:09:35.418 "uuid": "9665e4ef-3496-5f55-8b3a-d464a2be8ebb", 00:09:35.418 "is_configured": true, 00:09:35.418 "data_offset": 2048, 00:09:35.418 "data_size": 63488 00:09:35.418 }, 00:09:35.418 { 00:09:35.418 "name": "pt2", 00:09:35.418 "uuid": "fa88afbc-b391-5f4f-ba75-aacccbfae2de", 00:09:35.418 "is_configured": true, 00:09:35.418 "data_offset": 2048, 00:09:35.418 "data_size": 63488 00:09:35.418 } 00:09:35.418 ] 00:09:35.418 } 00:09:35.418 } 00:09:35.418 }' 00:09:35.418 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:35.418 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:09:35.418 pt2' 00:09:35.418 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:35.418 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:35.418 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:35.675 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:35.675 "name": "pt1", 00:09:35.675 "aliases": [ 00:09:35.675 "9665e4ef-3496-5f55-8b3a-d464a2be8ebb" 00:09:35.675 ], 00:09:35.675 "product_name": "passthru", 00:09:35.675 "block_size": 512, 00:09:35.675 "num_blocks": 65536, 00:09:35.675 "uuid": "9665e4ef-3496-5f55-8b3a-d464a2be8ebb", 00:09:35.675 "assigned_rate_limits": { 00:09:35.675 "rw_ios_per_sec": 0, 00:09:35.675 "rw_mbytes_per_sec": 0, 00:09:35.675 "r_mbytes_per_sec": 0, 00:09:35.675 "w_mbytes_per_sec": 0 00:09:35.675 }, 00:09:35.675 "claimed": true, 00:09:35.675 "claim_type": "exclusive_write", 00:09:35.675 "zoned": false, 00:09:35.675 "supported_io_types": { 00:09:35.675 "read": true, 00:09:35.675 "write": true, 00:09:35.675 "unmap": true, 00:09:35.675 "write_zeroes": true, 00:09:35.675 "flush": true, 00:09:35.675 "reset": true, 00:09:35.675 "compare": false, 00:09:35.675 "compare_and_write": false, 00:09:35.675 "abort": true, 00:09:35.675 "nvme_admin": false, 00:09:35.675 "nvme_io": false 00:09:35.675 }, 00:09:35.675 "memory_domains": [ 00:09:35.675 { 00:09:35.675 "dma_device_id": "system", 00:09:35.675 "dma_device_type": 1 00:09:35.675 }, 00:09:35.675 { 00:09:35.675 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:35.675 "dma_device_type": 2 
00:09:35.675 } 00:09:35.675 ], 00:09:35.675 "driver_specific": { 00:09:35.675 "passthru": { 00:09:35.675 "name": "pt1", 00:09:35.675 "base_bdev_name": "malloc1" 00:09:35.675 } 00:09:35.675 } 00:09:35.675 }' 00:09:35.675 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:35.932 04:10:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:36.190 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:36.190 "name": "pt2", 00:09:36.190 "aliases": [ 00:09:36.190 "fa88afbc-b391-5f4f-ba75-aacccbfae2de" 00:09:36.190 ], 00:09:36.190 "product_name": "passthru", 00:09:36.190 "block_size": 512, 00:09:36.190 "num_blocks": 65536, 00:09:36.190 "uuid": "fa88afbc-b391-5f4f-ba75-aacccbfae2de", 00:09:36.190 "assigned_rate_limits": { 00:09:36.190 "rw_ios_per_sec": 0, 00:09:36.190 "rw_mbytes_per_sec": 0, 00:09:36.190 "r_mbytes_per_sec": 0, 00:09:36.190 "w_mbytes_per_sec": 0 00:09:36.190 }, 00:09:36.190 "claimed": true, 00:09:36.190 "claim_type": "exclusive_write", 00:09:36.190 "zoned": false, 00:09:36.190 "supported_io_types": { 00:09:36.190 "read": true, 00:09:36.190 "write": true, 00:09:36.190 "unmap": true, 00:09:36.190 "write_zeroes": true, 00:09:36.190 "flush": true, 00:09:36.190 "reset": true, 00:09:36.190 "compare": false, 00:09:36.190 "compare_and_write": false, 00:09:36.190 "abort": true, 00:09:36.190 "nvme_admin": false, 00:09:36.190 "nvme_io": false 00:09:36.190 }, 00:09:36.190 "memory_domains": [ 00:09:36.190 { 00:09:36.190 "dma_device_id": "system", 00:09:36.190 "dma_device_type": 1 00:09:36.190 }, 00:09:36.190 { 00:09:36.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:36.190 "dma_device_type": 2 00:09:36.190 } 00:09:36.190 ], 00:09:36.190 "driver_specific": { 00:09:36.190 "passthru": { 00:09:36.190 "name": "pt2", 00:09:36.190 "base_bdev_name": "malloc2" 00:09:36.190 } 00:09:36.190 } 00:09:36.190 }' 00:09:36.190 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:36.449 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:36.449 04:10:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:36.449 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:36.449 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:36.449 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:36.449 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:36.449 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:36.449 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:36.449 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:36.449 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:36.707 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:36.707 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:36.707 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:09:36.707 [2024-05-15 04:10:24.713377] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 752cdd25-e703-4d71-bc07-ec20cf8ab467 '!=' 752cdd25-e703-4d71-bc07-ec20cf8ab467 ']' 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy concat 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # killprocess 3831840 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 3831840 ']' 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 3831840 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3831840 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3831840' 00:09:36.965 killing process with pid 3831840 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 3831840 00:09:36.965 [2024-05-15 04:10:24.763779] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:36.965 04:10:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 3831840 00:09:36.965 [2024-05-15 04:10:24.763884] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:36.965 [2024-05-15 04:10:24.763942] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:36.965 [2024-05-15 04:10:24.763956] 
bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x220c840 name raid_bdev1, state offline 00:09:36.965 [2024-05-15 04:10:24.782845] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:37.223 04:10:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@565 -- # return 0 00:09:37.223 00:09:37.223 real 0m10.328s 00:09:37.223 user 0m18.767s 00:09:37.223 sys 0m1.433s 00:09:37.223 04:10:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:37.223 04:10:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:37.223 ************************************ 00:09:37.223 END TEST raid_superblock_test 00:09:37.223 ************************************ 00:09:37.223 04:10:25 bdev_raid -- bdev/bdev_raid.sh@802 -- # for level in raid0 concat raid1 00:09:37.223 04:10:25 bdev_raid -- bdev/bdev_raid.sh@803 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:09:37.223 04:10:25 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:09:37.223 04:10:25 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:37.223 04:10:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:37.223 ************************************ 00:09:37.223 START TEST raid_state_function_test 00:09:37.223 ************************************ 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 false 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' 
raid1 ']' 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=3833263 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3833263' 00:09:37.223 Process raid pid: 3833263 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 3833263 /var/tmp/spdk-raid.sock 00:09:37.223 04:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 3833263 ']' 00:09:37.224 04:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:37.224 04:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:37.224 04:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:37.224 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:37.224 04:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:37.224 04:10:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:37.224 [2024-05-15 04:10:25.157324] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
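Annotation: a condensed sketch of how the trace above re-verifies raid_bdev1 once the second passthru bdev brings it back online, and of how raid_superblock_test then winds down; the RPC calls and jq filters are the ones shown in the log, the $rpc shorthand and comments are illustrative.

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# List the configured base bdevs of the re-assembled raid (expected: pt1 pt2).
$rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
# Confirm the raid UUID survived the delete / re-examine cycle
# (752cdd25-e703-4d71-bc07-ec20cf8ab467 in this run).
$rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid'
# concat carries no redundancy (has_redundancy returns 1 in the trace), so the
# test stops here and kills the bdev_svc app (pid 3831840 in this run), which
# frees the raid and both passthru bdevs during shutdown.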
00:09:37.224 [2024-05-15 04:10:25.157405] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:37.482 [2024-05-15 04:10:25.240515] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:37.482 [2024-05-15 04:10:25.357071] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:37.482 [2024-05-15 04:10:25.430444] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:37.482 [2024-05-15 04:10:25.430494] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:38.415 04:10:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:38.415 04:10:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:38.416 [2024-05-15 04:10:26.301993] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:38.416 [2024-05-15 04:10:26.302041] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:38.416 [2024-05-15 04:10:26.302063] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:38.416 [2024-05-15 04:10:26.302076] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:38.416 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:38.674 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:38.674 "name": "Existed_Raid", 00:09:38.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:38.674 "strip_size_kb": 0, 00:09:38.674 "state": "configuring", 00:09:38.674 "raid_level": "raid1", 00:09:38.674 "superblock": false, 00:09:38.674 "num_base_bdevs": 2, 
00:09:38.674 "num_base_bdevs_discovered": 0, 00:09:38.674 "num_base_bdevs_operational": 2, 00:09:38.674 "base_bdevs_list": [ 00:09:38.674 { 00:09:38.674 "name": "BaseBdev1", 00:09:38.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:38.674 "is_configured": false, 00:09:38.674 "data_offset": 0, 00:09:38.674 "data_size": 0 00:09:38.674 }, 00:09:38.674 { 00:09:38.674 "name": "BaseBdev2", 00:09:38.674 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:38.674 "is_configured": false, 00:09:38.674 "data_offset": 0, 00:09:38.674 "data_size": 0 00:09:38.674 } 00:09:38.674 ] 00:09:38.674 }' 00:09:38.674 04:10:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:38.674 04:10:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:39.238 04:10:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:39.495 [2024-05-15 04:10:27.332669] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:39.495 [2024-05-15 04:10:27.332705] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a27000 name Existed_Raid, state configuring 00:09:39.495 04:10:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:39.752 [2024-05-15 04:10:27.573302] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:39.752 [2024-05-15 04:10:27.573339] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:39.752 [2024-05-15 04:10:27.573351] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:39.752 [2024-05-15 04:10:27.573363] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:39.752 04:10:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:40.009 [2024-05-15 04:10:27.821806] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:40.009 BaseBdev1 00:09:40.009 04:10:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:09:40.009 04:10:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:09:40.009 04:10:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:40.009 04:10:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:09:40.010 04:10:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:40.010 04:10:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:40.010 04:10:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:40.268 04:10:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:40.525 [ 00:09:40.525 { 00:09:40.525 
"name": "BaseBdev1", 00:09:40.525 "aliases": [ 00:09:40.525 "0ee5a744-f912-4c3c-88cb-0c991051f4d6" 00:09:40.525 ], 00:09:40.525 "product_name": "Malloc disk", 00:09:40.525 "block_size": 512, 00:09:40.525 "num_blocks": 65536, 00:09:40.525 "uuid": "0ee5a744-f912-4c3c-88cb-0c991051f4d6", 00:09:40.525 "assigned_rate_limits": { 00:09:40.525 "rw_ios_per_sec": 0, 00:09:40.525 "rw_mbytes_per_sec": 0, 00:09:40.525 "r_mbytes_per_sec": 0, 00:09:40.525 "w_mbytes_per_sec": 0 00:09:40.525 }, 00:09:40.525 "claimed": true, 00:09:40.525 "claim_type": "exclusive_write", 00:09:40.525 "zoned": false, 00:09:40.525 "supported_io_types": { 00:09:40.525 "read": true, 00:09:40.525 "write": true, 00:09:40.525 "unmap": true, 00:09:40.525 "write_zeroes": true, 00:09:40.525 "flush": true, 00:09:40.525 "reset": true, 00:09:40.525 "compare": false, 00:09:40.525 "compare_and_write": false, 00:09:40.525 "abort": true, 00:09:40.525 "nvme_admin": false, 00:09:40.525 "nvme_io": false 00:09:40.525 }, 00:09:40.525 "memory_domains": [ 00:09:40.526 { 00:09:40.526 "dma_device_id": "system", 00:09:40.526 "dma_device_type": 1 00:09:40.526 }, 00:09:40.526 { 00:09:40.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:40.526 "dma_device_type": 2 00:09:40.526 } 00:09:40.526 ], 00:09:40.526 "driver_specific": {} 00:09:40.526 } 00:09:40.526 ] 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:40.526 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:40.783 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:40.783 "name": "Existed_Raid", 00:09:40.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:40.783 "strip_size_kb": 0, 00:09:40.783 "state": "configuring", 00:09:40.783 "raid_level": "raid1", 00:09:40.783 "superblock": false, 00:09:40.783 "num_base_bdevs": 2, 00:09:40.783 "num_base_bdevs_discovered": 1, 00:09:40.783 "num_base_bdevs_operational": 2, 00:09:40.783 "base_bdevs_list": [ 00:09:40.783 { 00:09:40.783 "name": "BaseBdev1", 00:09:40.783 "uuid": "0ee5a744-f912-4c3c-88cb-0c991051f4d6", 00:09:40.783 "is_configured": 
true, 00:09:40.783 "data_offset": 0, 00:09:40.783 "data_size": 65536 00:09:40.783 }, 00:09:40.783 { 00:09:40.783 "name": "BaseBdev2", 00:09:40.783 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:40.783 "is_configured": false, 00:09:40.783 "data_offset": 0, 00:09:40.783 "data_size": 0 00:09:40.783 } 00:09:40.783 ] 00:09:40.783 }' 00:09:40.783 04:10:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:40.783 04:10:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:41.348 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:41.348 [2024-05-15 04:10:29.329796] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:41.348 [2024-05-15 04:10:29.329867] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a268f0 name Existed_Raid, state configuring 00:09:41.348 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:41.605 [2024-05-15 04:10:29.574485] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:41.605 [2024-05-15 04:10:29.575872] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:41.605 [2024-05-15 04:10:29.575904] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:41.605 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:41.862 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:41.862 "name": "Existed_Raid", 00:09:41.862 "uuid": "00000000-0000-0000-0000-000000000000", 
00:09:41.862 "strip_size_kb": 0, 00:09:41.862 "state": "configuring", 00:09:41.862 "raid_level": "raid1", 00:09:41.862 "superblock": false, 00:09:41.862 "num_base_bdevs": 2, 00:09:41.862 "num_base_bdevs_discovered": 1, 00:09:41.862 "num_base_bdevs_operational": 2, 00:09:41.862 "base_bdevs_list": [ 00:09:41.862 { 00:09:41.862 "name": "BaseBdev1", 00:09:41.862 "uuid": "0ee5a744-f912-4c3c-88cb-0c991051f4d6", 00:09:41.862 "is_configured": true, 00:09:41.862 "data_offset": 0, 00:09:41.862 "data_size": 65536 00:09:41.862 }, 00:09:41.862 { 00:09:41.862 "name": "BaseBdev2", 00:09:41.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:41.862 "is_configured": false, 00:09:41.862 "data_offset": 0, 00:09:41.862 "data_size": 0 00:09:41.862 } 00:09:41.862 ] 00:09:41.862 }' 00:09:41.862 04:10:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:41.862 04:10:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:42.427 04:10:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:42.684 [2024-05-15 04:10:30.643264] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:42.684 [2024-05-15 04:10:30.643325] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a276e0 00:09:42.684 [2024-05-15 04:10:30.643335] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:09:42.684 [2024-05-15 04:10:30.643540] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a1df80 00:09:42.684 [2024-05-15 04:10:30.643705] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a276e0 00:09:42.684 [2024-05-15 04:10:30.643722] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a276e0 00:09:42.684 [2024-05-15 04:10:30.643956] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:42.684 BaseBdev2 00:09:42.684 04:10:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:09:42.684 04:10:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:09:42.684 04:10:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:42.684 04:10:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:09:42.684 04:10:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:42.684 04:10:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:42.684 04:10:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:42.942 04:10:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:43.199 [ 00:09:43.199 { 00:09:43.199 "name": "BaseBdev2", 00:09:43.199 "aliases": [ 00:09:43.199 "ce0e9e63-3705-4a33-80a9-a3852cb5a9eb" 00:09:43.199 ], 00:09:43.199 "product_name": "Malloc disk", 00:09:43.199 "block_size": 512, 00:09:43.199 "num_blocks": 65536, 00:09:43.199 "uuid": "ce0e9e63-3705-4a33-80a9-a3852cb5a9eb", 00:09:43.199 
"assigned_rate_limits": { 00:09:43.199 "rw_ios_per_sec": 0, 00:09:43.199 "rw_mbytes_per_sec": 0, 00:09:43.199 "r_mbytes_per_sec": 0, 00:09:43.199 "w_mbytes_per_sec": 0 00:09:43.199 }, 00:09:43.199 "claimed": true, 00:09:43.199 "claim_type": "exclusive_write", 00:09:43.199 "zoned": false, 00:09:43.199 "supported_io_types": { 00:09:43.199 "read": true, 00:09:43.199 "write": true, 00:09:43.199 "unmap": true, 00:09:43.199 "write_zeroes": true, 00:09:43.199 "flush": true, 00:09:43.199 "reset": true, 00:09:43.199 "compare": false, 00:09:43.199 "compare_and_write": false, 00:09:43.199 "abort": true, 00:09:43.199 "nvme_admin": false, 00:09:43.199 "nvme_io": false 00:09:43.199 }, 00:09:43.199 "memory_domains": [ 00:09:43.199 { 00:09:43.199 "dma_device_id": "system", 00:09:43.199 "dma_device_type": 1 00:09:43.199 }, 00:09:43.199 { 00:09:43.199 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:43.199 "dma_device_type": 2 00:09:43.199 } 00:09:43.199 ], 00:09:43.199 "driver_specific": {} 00:09:43.199 } 00:09:43.199 ] 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:43.199 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:43.457 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:43.457 "name": "Existed_Raid", 00:09:43.457 "uuid": "d20465e0-ec0b-43e5-b401-dab6d7423b8d", 00:09:43.457 "strip_size_kb": 0, 00:09:43.457 "state": "online", 00:09:43.457 "raid_level": "raid1", 00:09:43.457 "superblock": false, 00:09:43.457 "num_base_bdevs": 2, 00:09:43.457 "num_base_bdevs_discovered": 2, 00:09:43.457 "num_base_bdevs_operational": 2, 00:09:43.457 "base_bdevs_list": [ 00:09:43.457 { 00:09:43.457 "name": "BaseBdev1", 00:09:43.457 "uuid": "0ee5a744-f912-4c3c-88cb-0c991051f4d6", 00:09:43.457 "is_configured": true, 00:09:43.457 "data_offset": 0, 00:09:43.457 "data_size": 65536 00:09:43.457 }, 00:09:43.457 { 
00:09:43.457 "name": "BaseBdev2", 00:09:43.457 "uuid": "ce0e9e63-3705-4a33-80a9-a3852cb5a9eb", 00:09:43.457 "is_configured": true, 00:09:43.457 "data_offset": 0, 00:09:43.457 "data_size": 65536 00:09:43.457 } 00:09:43.457 ] 00:09:43.457 }' 00:09:43.457 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:43.457 04:10:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:44.022 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:09:44.022 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:09:44.022 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:09:44.022 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:09:44.022 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:09:44.022 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:09:44.022 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:44.022 04:10:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:09:44.280 [2024-05-15 04:10:32.159458] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:44.280 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:09:44.280 "name": "Existed_Raid", 00:09:44.280 "aliases": [ 00:09:44.280 "d20465e0-ec0b-43e5-b401-dab6d7423b8d" 00:09:44.280 ], 00:09:44.280 "product_name": "Raid Volume", 00:09:44.280 "block_size": 512, 00:09:44.280 "num_blocks": 65536, 00:09:44.280 "uuid": "d20465e0-ec0b-43e5-b401-dab6d7423b8d", 00:09:44.280 "assigned_rate_limits": { 00:09:44.280 "rw_ios_per_sec": 0, 00:09:44.280 "rw_mbytes_per_sec": 0, 00:09:44.280 "r_mbytes_per_sec": 0, 00:09:44.280 "w_mbytes_per_sec": 0 00:09:44.280 }, 00:09:44.280 "claimed": false, 00:09:44.280 "zoned": false, 00:09:44.280 "supported_io_types": { 00:09:44.280 "read": true, 00:09:44.280 "write": true, 00:09:44.280 "unmap": false, 00:09:44.280 "write_zeroes": true, 00:09:44.280 "flush": false, 00:09:44.280 "reset": true, 00:09:44.280 "compare": false, 00:09:44.280 "compare_and_write": false, 00:09:44.280 "abort": false, 00:09:44.280 "nvme_admin": false, 00:09:44.280 "nvme_io": false 00:09:44.280 }, 00:09:44.280 "memory_domains": [ 00:09:44.280 { 00:09:44.280 "dma_device_id": "system", 00:09:44.280 "dma_device_type": 1 00:09:44.280 }, 00:09:44.280 { 00:09:44.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:44.280 "dma_device_type": 2 00:09:44.280 }, 00:09:44.280 { 00:09:44.280 "dma_device_id": "system", 00:09:44.280 "dma_device_type": 1 00:09:44.280 }, 00:09:44.280 { 00:09:44.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:44.280 "dma_device_type": 2 00:09:44.280 } 00:09:44.280 ], 00:09:44.280 "driver_specific": { 00:09:44.280 "raid": { 00:09:44.280 "uuid": "d20465e0-ec0b-43e5-b401-dab6d7423b8d", 00:09:44.280 "strip_size_kb": 0, 00:09:44.280 "state": "online", 00:09:44.280 "raid_level": "raid1", 00:09:44.280 "superblock": false, 00:09:44.280 "num_base_bdevs": 2, 00:09:44.280 "num_base_bdevs_discovered": 2, 00:09:44.280 "num_base_bdevs_operational": 2, 00:09:44.280 "base_bdevs_list": [ 00:09:44.280 { 00:09:44.280 "name": 
"BaseBdev1", 00:09:44.280 "uuid": "0ee5a744-f912-4c3c-88cb-0c991051f4d6", 00:09:44.280 "is_configured": true, 00:09:44.280 "data_offset": 0, 00:09:44.280 "data_size": 65536 00:09:44.280 }, 00:09:44.280 { 00:09:44.280 "name": "BaseBdev2", 00:09:44.280 "uuid": "ce0e9e63-3705-4a33-80a9-a3852cb5a9eb", 00:09:44.280 "is_configured": true, 00:09:44.280 "data_offset": 0, 00:09:44.280 "data_size": 65536 00:09:44.280 } 00:09:44.280 ] 00:09:44.280 } 00:09:44.280 } 00:09:44.280 }' 00:09:44.280 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:44.280 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:09:44.280 BaseBdev2' 00:09:44.280 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:44.280 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:44.280 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:44.538 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:44.538 "name": "BaseBdev1", 00:09:44.538 "aliases": [ 00:09:44.538 "0ee5a744-f912-4c3c-88cb-0c991051f4d6" 00:09:44.538 ], 00:09:44.538 "product_name": "Malloc disk", 00:09:44.538 "block_size": 512, 00:09:44.538 "num_blocks": 65536, 00:09:44.538 "uuid": "0ee5a744-f912-4c3c-88cb-0c991051f4d6", 00:09:44.538 "assigned_rate_limits": { 00:09:44.538 "rw_ios_per_sec": 0, 00:09:44.538 "rw_mbytes_per_sec": 0, 00:09:44.538 "r_mbytes_per_sec": 0, 00:09:44.538 "w_mbytes_per_sec": 0 00:09:44.538 }, 00:09:44.538 "claimed": true, 00:09:44.538 "claim_type": "exclusive_write", 00:09:44.538 "zoned": false, 00:09:44.538 "supported_io_types": { 00:09:44.538 "read": true, 00:09:44.538 "write": true, 00:09:44.538 "unmap": true, 00:09:44.538 "write_zeroes": true, 00:09:44.538 "flush": true, 00:09:44.538 "reset": true, 00:09:44.538 "compare": false, 00:09:44.538 "compare_and_write": false, 00:09:44.538 "abort": true, 00:09:44.538 "nvme_admin": false, 00:09:44.538 "nvme_io": false 00:09:44.538 }, 00:09:44.538 "memory_domains": [ 00:09:44.538 { 00:09:44.538 "dma_device_id": "system", 00:09:44.538 "dma_device_type": 1 00:09:44.538 }, 00:09:44.538 { 00:09:44.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:44.538 "dma_device_type": 2 00:09:44.538 } 00:09:44.538 ], 00:09:44.538 "driver_specific": {} 00:09:44.538 }' 00:09:44.538 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:44.538 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:44.538 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:44.538 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:44.796 04:10:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:45.054 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:45.054 "name": "BaseBdev2", 00:09:45.054 "aliases": [ 00:09:45.054 "ce0e9e63-3705-4a33-80a9-a3852cb5a9eb" 00:09:45.054 ], 00:09:45.054 "product_name": "Malloc disk", 00:09:45.054 "block_size": 512, 00:09:45.054 "num_blocks": 65536, 00:09:45.054 "uuid": "ce0e9e63-3705-4a33-80a9-a3852cb5a9eb", 00:09:45.054 "assigned_rate_limits": { 00:09:45.054 "rw_ios_per_sec": 0, 00:09:45.054 "rw_mbytes_per_sec": 0, 00:09:45.054 "r_mbytes_per_sec": 0, 00:09:45.054 "w_mbytes_per_sec": 0 00:09:45.054 }, 00:09:45.054 "claimed": true, 00:09:45.054 "claim_type": "exclusive_write", 00:09:45.054 "zoned": false, 00:09:45.054 "supported_io_types": { 00:09:45.054 "read": true, 00:09:45.054 "write": true, 00:09:45.054 "unmap": true, 00:09:45.054 "write_zeroes": true, 00:09:45.054 "flush": true, 00:09:45.054 "reset": true, 00:09:45.054 "compare": false, 00:09:45.054 "compare_and_write": false, 00:09:45.054 "abort": true, 00:09:45.054 "nvme_admin": false, 00:09:45.054 "nvme_io": false 00:09:45.054 }, 00:09:45.054 "memory_domains": [ 00:09:45.054 { 00:09:45.054 "dma_device_id": "system", 00:09:45.054 "dma_device_type": 1 00:09:45.054 }, 00:09:45.054 { 00:09:45.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:45.054 "dma_device_type": 2 00:09:45.054 } 00:09:45.054 ], 00:09:45.054 "driver_specific": {} 00:09:45.054 }' 00:09:45.054 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:45.054 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:45.312 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:09:45.577 [2024-05-15 04:10:33.543272] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 0 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:09:45.577 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:45.578 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:45.578 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:45.578 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:45.578 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:45.578 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:45.836 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:45.836 "name": "Existed_Raid", 00:09:45.836 "uuid": "d20465e0-ec0b-43e5-b401-dab6d7423b8d", 00:09:45.836 "strip_size_kb": 0, 00:09:45.836 "state": "online", 00:09:45.836 "raid_level": "raid1", 00:09:45.836 "superblock": false, 00:09:45.836 "num_base_bdevs": 2, 00:09:45.836 "num_base_bdevs_discovered": 1, 00:09:45.836 "num_base_bdevs_operational": 1, 00:09:45.836 "base_bdevs_list": [ 00:09:45.836 { 00:09:45.836 "name": null, 00:09:45.836 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:45.836 "is_configured": false, 00:09:45.836 "data_offset": 0, 00:09:45.836 "data_size": 65536 00:09:45.836 }, 00:09:45.836 { 00:09:45.836 "name": "BaseBdev2", 00:09:45.836 "uuid": "ce0e9e63-3705-4a33-80a9-a3852cb5a9eb", 00:09:45.836 "is_configured": true, 00:09:45.836 "data_offset": 0, 00:09:45.836 "data_size": 65536 00:09:45.836 } 00:09:45.836 ] 00:09:45.836 }' 00:09:45.836 04:10:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:45.836 04:10:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:46.401 04:10:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:09:46.401 04:10:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:09:46.402 04:10:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:46.402 04:10:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:09:46.659 04:10:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:09:46.659 04:10:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:46.659 04:10:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:46.917 [2024-05-15 04:10:34.820787] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:46.917 [2024-05-15 04:10:34.820916] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:46.917 [2024-05-15 04:10:34.832629] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:46.917 [2024-05-15 04:10:34.832705] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:46.917 [2024-05-15 04:10:34.832718] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a276e0 name Existed_Raid, state offline 00:09:46.917 04:10:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:09:46.917 04:10:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:09:46.917 04:10:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:46.917 04:10:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 3833263 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 3833263 ']' 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 3833263 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3833263 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3833263' 00:09:47.174 killing process with pid 3833263 00:09:47.174 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 3833263 00:09:47.174 [2024-05-15 04:10:35.119225] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:47.174 
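The two bdev_malloc_delete calls above are the heart of this test: pulling BaseBdev1 out from under a raid1 volume leaves Existed_Raid online with a single operational base bdev, and only removing BaseBdev2 as well sends the array offline and lets it be cleaned up, after which the bdev_svc process is killed. Reduced to the RPC traffic, the check looks roughly like this (same socket and shortened rpc.py path as above):

  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # still "online": raid1 tolerates one missing base bdev
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[0]["name"] | select(.)'                         # empty once the raid bdev has gone offline and been freed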
04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 3833263 00:09:47.174 [2024-05-15 04:10:35.120279] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:47.431 04:10:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:09:47.431 00:09:47.431 real 0m10.289s 00:09:47.431 user 0m18.575s 00:09:47.431 sys 0m1.450s 00:09:47.431 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:47.431 04:10:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:47.431 ************************************ 00:09:47.431 END TEST raid_state_function_test 00:09:47.431 ************************************ 00:09:47.431 04:10:35 bdev_raid -- bdev/bdev_raid.sh@804 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:09:47.431 04:10:35 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:09:47.431 04:10:35 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:47.431 04:10:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:47.431 ************************************ 00:09:47.431 START TEST raid_state_function_test_sb 00:09:47.431 ************************************ 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:09:47.689 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:09:47.690 
04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=3834693 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3834693' 00:09:47.690 Process raid pid: 3834693 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 3834693 /var/tmp/spdk-raid.sock 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3834693 ']' 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:47.690 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:47.690 04:10:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:47.690 [2024-05-15 04:10:35.498556] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
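Each of these state-function tests drives its own bare SPDK application: the raid_pid printed above belongs to a bdev_svc instance launched with raid debug logging and a private RPC socket, and every rpc.py invocation in the trace targets that socket through -s. Stripped of the harness plumbing, the startup amounts to roughly the following (paths relative to the SPDK checkout; waitforlisten is the polling helper from the common test scripts visible in the trace):

  ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  raid_pid=$!
  waitforlisten $raid_pid /var/tmp/spdk-raid.sock   # returns once the app accepts RPCs on the socket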
00:09:47.690 [2024-05-15 04:10:35.498627] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:47.690 [2024-05-15 04:10:35.574454] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:47.690 [2024-05-15 04:10:35.685089] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:47.948 [2024-05-15 04:10:35.757566] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:47.948 [2024-05-15 04:10:35.757604] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:48.513 04:10:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:48.513 04:10:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:09:48.513 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:48.771 [2024-05-15 04:10:36.644757] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:48.771 [2024-05-15 04:10:36.644806] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:48.771 [2024-05-15 04:10:36.644838] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:48.771 [2024-05-15 04:10:36.644853] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:48.771 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:49.029 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:49.029 "name": "Existed_Raid", 00:09:49.029 "uuid": "c7b587e1-f23c-425c-8f90-4ea9efe1cccb", 00:09:49.029 "strip_size_kb": 0, 00:09:49.029 "state": "configuring", 00:09:49.029 "raid_level": "raid1", 00:09:49.029 "superblock": 
true, 00:09:49.029 "num_base_bdevs": 2, 00:09:49.029 "num_base_bdevs_discovered": 0, 00:09:49.029 "num_base_bdevs_operational": 2, 00:09:49.029 "base_bdevs_list": [ 00:09:49.029 { 00:09:49.029 "name": "BaseBdev1", 00:09:49.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:49.029 "is_configured": false, 00:09:49.029 "data_offset": 0, 00:09:49.029 "data_size": 0 00:09:49.029 }, 00:09:49.029 { 00:09:49.029 "name": "BaseBdev2", 00:09:49.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:49.029 "is_configured": false, 00:09:49.029 "data_offset": 0, 00:09:49.029 "data_size": 0 00:09:49.029 } 00:09:49.029 ] 00:09:49.029 }' 00:09:49.029 04:10:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:49.029 04:10:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:49.595 04:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:49.853 [2024-05-15 04:10:37.663371] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:49.853 [2024-05-15 04:10:37.663402] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc2f000 name Existed_Raid, state configuring 00:09:49.853 04:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:50.112 [2024-05-15 04:10:37.912056] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:50.112 [2024-05-15 04:10:37.912095] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:50.112 [2024-05-15 04:10:37.912119] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:50.112 [2024-05-15 04:10:37.912129] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:50.112 04:10:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:50.370 [2024-05-15 04:10:38.161152] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:50.370 BaseBdev1 00:09:50.370 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:09:50.370 04:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:09:50.370 04:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:50.370 04:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:09:50.370 04:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:50.370 04:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:50.370 04:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:50.628 04:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:50.886 [ 00:09:50.886 { 00:09:50.886 "name": "BaseBdev1", 00:09:50.886 "aliases": [ 00:09:50.886 "25a97160-698d-44b9-824a-cd9a9f4ad94b" 00:09:50.886 ], 00:09:50.886 "product_name": "Malloc disk", 00:09:50.886 "block_size": 512, 00:09:50.886 "num_blocks": 65536, 00:09:50.886 "uuid": "25a97160-698d-44b9-824a-cd9a9f4ad94b", 00:09:50.886 "assigned_rate_limits": { 00:09:50.887 "rw_ios_per_sec": 0, 00:09:50.887 "rw_mbytes_per_sec": 0, 00:09:50.887 "r_mbytes_per_sec": 0, 00:09:50.887 "w_mbytes_per_sec": 0 00:09:50.887 }, 00:09:50.887 "claimed": true, 00:09:50.887 "claim_type": "exclusive_write", 00:09:50.887 "zoned": false, 00:09:50.887 "supported_io_types": { 00:09:50.887 "read": true, 00:09:50.887 "write": true, 00:09:50.887 "unmap": true, 00:09:50.887 "write_zeroes": true, 00:09:50.887 "flush": true, 00:09:50.887 "reset": true, 00:09:50.887 "compare": false, 00:09:50.887 "compare_and_write": false, 00:09:50.887 "abort": true, 00:09:50.887 "nvme_admin": false, 00:09:50.887 "nvme_io": false 00:09:50.887 }, 00:09:50.887 "memory_domains": [ 00:09:50.887 { 00:09:50.887 "dma_device_id": "system", 00:09:50.887 "dma_device_type": 1 00:09:50.887 }, 00:09:50.887 { 00:09:50.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:50.887 "dma_device_type": 2 00:09:50.887 } 00:09:50.887 ], 00:09:50.887 "driver_specific": {} 00:09:50.887 } 00:09:50.887 ] 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:50.887 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:51.154 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:51.154 "name": "Existed_Raid", 00:09:51.154 "uuid": "6336ef4b-7d73-4626-ab07-ec7184333ae6", 00:09:51.154 "strip_size_kb": 0, 00:09:51.154 "state": "configuring", 00:09:51.154 "raid_level": "raid1", 00:09:51.154 "superblock": true, 00:09:51.154 "num_base_bdevs": 2, 00:09:51.154 "num_base_bdevs_discovered": 1, 00:09:51.154 "num_base_bdevs_operational": 2, 00:09:51.154 "base_bdevs_list": [ 00:09:51.154 { 
00:09:51.154 "name": "BaseBdev1", 00:09:51.154 "uuid": "25a97160-698d-44b9-824a-cd9a9f4ad94b", 00:09:51.154 "is_configured": true, 00:09:51.154 "data_offset": 2048, 00:09:51.155 "data_size": 63488 00:09:51.155 }, 00:09:51.155 { 00:09:51.155 "name": "BaseBdev2", 00:09:51.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:51.155 "is_configured": false, 00:09:51.155 "data_offset": 0, 00:09:51.155 "data_size": 0 00:09:51.155 } 00:09:51.155 ] 00:09:51.155 }' 00:09:51.155 04:10:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:51.155 04:10:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:51.721 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:51.721 [2024-05-15 04:10:39.677188] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:51.721 [2024-05-15 04:10:39.677239] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc2e8f0 name Existed_Raid, state configuring 00:09:51.721 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:51.979 [2024-05-15 04:10:39.917859] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:51.979 [2024-05-15 04:10:39.919164] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:51.979 [2024-05-15 04:10:39.919194] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:51.979 04:10:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:52.237 04:10:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:52.237 "name": "Existed_Raid", 00:09:52.237 "uuid": "dc511287-435c-4eeb-82a7-9eb126df6ac5", 00:09:52.237 "strip_size_kb": 0, 00:09:52.237 "state": "configuring", 00:09:52.237 "raid_level": "raid1", 00:09:52.237 "superblock": true, 00:09:52.237 "num_base_bdevs": 2, 00:09:52.237 "num_base_bdevs_discovered": 1, 00:09:52.237 "num_base_bdevs_operational": 2, 00:09:52.237 "base_bdevs_list": [ 00:09:52.237 { 00:09:52.237 "name": "BaseBdev1", 00:09:52.237 "uuid": "25a97160-698d-44b9-824a-cd9a9f4ad94b", 00:09:52.237 "is_configured": true, 00:09:52.237 "data_offset": 2048, 00:09:52.237 "data_size": 63488 00:09:52.237 }, 00:09:52.237 { 00:09:52.237 "name": "BaseBdev2", 00:09:52.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:52.237 "is_configured": false, 00:09:52.237 "data_offset": 0, 00:09:52.237 "data_size": 0 00:09:52.237 } 00:09:52.237 ] 00:09:52.237 }' 00:09:52.237 04:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:52.237 04:10:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:52.803 04:10:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:53.062 [2024-05-15 04:10:41.002433] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:53.062 [2024-05-15 04:10:41.002668] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xc2f6e0 00:09:53.062 [2024-05-15 04:10:41.002687] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:09:53.062 [2024-05-15 04:10:41.002877] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc30bb0 00:09:53.062 [2024-05-15 04:10:41.003027] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc2f6e0 00:09:53.062 [2024-05-15 04:10:41.003041] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc2f6e0 00:09:53.062 [2024-05-15 04:10:41.003153] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:53.062 BaseBdev2 00:09:53.062 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:09:53.062 04:10:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:09:53.062 04:10:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:09:53.062 04:10:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:09:53.062 04:10:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:09:53.062 04:10:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:09:53.062 04:10:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:53.321 04:10:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:53.579 [ 00:09:53.579 { 00:09:53.579 "name": "BaseBdev2", 00:09:53.579 "aliases": [ 00:09:53.579 "bfef46ad-7afc-4d9f-8630-1b4ef809b813" 
00:09:53.579 ], 00:09:53.579 "product_name": "Malloc disk", 00:09:53.579 "block_size": 512, 00:09:53.579 "num_blocks": 65536, 00:09:53.579 "uuid": "bfef46ad-7afc-4d9f-8630-1b4ef809b813", 00:09:53.579 "assigned_rate_limits": { 00:09:53.579 "rw_ios_per_sec": 0, 00:09:53.579 "rw_mbytes_per_sec": 0, 00:09:53.579 "r_mbytes_per_sec": 0, 00:09:53.579 "w_mbytes_per_sec": 0 00:09:53.579 }, 00:09:53.579 "claimed": true, 00:09:53.579 "claim_type": "exclusive_write", 00:09:53.579 "zoned": false, 00:09:53.579 "supported_io_types": { 00:09:53.579 "read": true, 00:09:53.579 "write": true, 00:09:53.579 "unmap": true, 00:09:53.579 "write_zeroes": true, 00:09:53.579 "flush": true, 00:09:53.579 "reset": true, 00:09:53.579 "compare": false, 00:09:53.579 "compare_and_write": false, 00:09:53.579 "abort": true, 00:09:53.579 "nvme_admin": false, 00:09:53.579 "nvme_io": false 00:09:53.579 }, 00:09:53.579 "memory_domains": [ 00:09:53.579 { 00:09:53.579 "dma_device_id": "system", 00:09:53.579 "dma_device_type": 1 00:09:53.579 }, 00:09:53.579 { 00:09:53.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:53.579 "dma_device_type": 2 00:09:53.579 } 00:09:53.579 ], 00:09:53.579 "driver_specific": {} 00:09:53.579 } 00:09:53.579 ] 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:53.579 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:53.837 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:53.837 "name": "Existed_Raid", 00:09:53.837 "uuid": "dc511287-435c-4eeb-82a7-9eb126df6ac5", 00:09:53.837 "strip_size_kb": 0, 00:09:53.837 "state": "online", 00:09:53.837 "raid_level": "raid1", 00:09:53.837 "superblock": true, 00:09:53.837 "num_base_bdevs": 2, 00:09:53.837 "num_base_bdevs_discovered": 2, 00:09:53.837 "num_base_bdevs_operational": 2, 00:09:53.837 "base_bdevs_list": [ 
00:09:53.837 { 00:09:53.837 "name": "BaseBdev1", 00:09:53.837 "uuid": "25a97160-698d-44b9-824a-cd9a9f4ad94b", 00:09:53.837 "is_configured": true, 00:09:53.837 "data_offset": 2048, 00:09:53.837 "data_size": 63488 00:09:53.837 }, 00:09:53.837 { 00:09:53.837 "name": "BaseBdev2", 00:09:53.837 "uuid": "bfef46ad-7afc-4d9f-8630-1b4ef809b813", 00:09:53.837 "is_configured": true, 00:09:53.837 "data_offset": 2048, 00:09:53.837 "data_size": 63488 00:09:53.837 } 00:09:53.837 ] 00:09:53.837 }' 00:09:53.837 04:10:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:53.837 04:10:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:54.403 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:09:54.403 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:09:54.403 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:09:54.403 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:09:54.403 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:09:54.403 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:09:54.403 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:54.403 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:09:54.662 [2024-05-15 04:10:42.522626] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:54.662 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:09:54.662 "name": "Existed_Raid", 00:09:54.662 "aliases": [ 00:09:54.662 "dc511287-435c-4eeb-82a7-9eb126df6ac5" 00:09:54.662 ], 00:09:54.662 "product_name": "Raid Volume", 00:09:54.662 "block_size": 512, 00:09:54.662 "num_blocks": 63488, 00:09:54.662 "uuid": "dc511287-435c-4eeb-82a7-9eb126df6ac5", 00:09:54.662 "assigned_rate_limits": { 00:09:54.662 "rw_ios_per_sec": 0, 00:09:54.662 "rw_mbytes_per_sec": 0, 00:09:54.662 "r_mbytes_per_sec": 0, 00:09:54.662 "w_mbytes_per_sec": 0 00:09:54.662 }, 00:09:54.662 "claimed": false, 00:09:54.662 "zoned": false, 00:09:54.662 "supported_io_types": { 00:09:54.662 "read": true, 00:09:54.662 "write": true, 00:09:54.662 "unmap": false, 00:09:54.662 "write_zeroes": true, 00:09:54.662 "flush": false, 00:09:54.662 "reset": true, 00:09:54.662 "compare": false, 00:09:54.662 "compare_and_write": false, 00:09:54.662 "abort": false, 00:09:54.662 "nvme_admin": false, 00:09:54.662 "nvme_io": false 00:09:54.662 }, 00:09:54.662 "memory_domains": [ 00:09:54.662 { 00:09:54.662 "dma_device_id": "system", 00:09:54.662 "dma_device_type": 1 00:09:54.662 }, 00:09:54.662 { 00:09:54.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:54.662 "dma_device_type": 2 00:09:54.662 }, 00:09:54.662 { 00:09:54.662 "dma_device_id": "system", 00:09:54.662 "dma_device_type": 1 00:09:54.662 }, 00:09:54.662 { 00:09:54.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:54.662 "dma_device_type": 2 00:09:54.662 } 00:09:54.662 ], 00:09:54.662 "driver_specific": { 00:09:54.662 "raid": { 00:09:54.662 "uuid": "dc511287-435c-4eeb-82a7-9eb126df6ac5", 00:09:54.662 "strip_size_kb": 0, 00:09:54.662 "state": 
"online", 00:09:54.662 "raid_level": "raid1", 00:09:54.662 "superblock": true, 00:09:54.662 "num_base_bdevs": 2, 00:09:54.662 "num_base_bdevs_discovered": 2, 00:09:54.662 "num_base_bdevs_operational": 2, 00:09:54.662 "base_bdevs_list": [ 00:09:54.662 { 00:09:54.662 "name": "BaseBdev1", 00:09:54.662 "uuid": "25a97160-698d-44b9-824a-cd9a9f4ad94b", 00:09:54.662 "is_configured": true, 00:09:54.662 "data_offset": 2048, 00:09:54.662 "data_size": 63488 00:09:54.662 }, 00:09:54.662 { 00:09:54.662 "name": "BaseBdev2", 00:09:54.662 "uuid": "bfef46ad-7afc-4d9f-8630-1b4ef809b813", 00:09:54.662 "is_configured": true, 00:09:54.662 "data_offset": 2048, 00:09:54.662 "data_size": 63488 00:09:54.662 } 00:09:54.662 ] 00:09:54.662 } 00:09:54.662 } 00:09:54.662 }' 00:09:54.662 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:54.662 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:09:54.662 BaseBdev2' 00:09:54.662 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:54.662 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:54.662 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:54.921 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:54.921 "name": "BaseBdev1", 00:09:54.921 "aliases": [ 00:09:54.921 "25a97160-698d-44b9-824a-cd9a9f4ad94b" 00:09:54.921 ], 00:09:54.921 "product_name": "Malloc disk", 00:09:54.921 "block_size": 512, 00:09:54.921 "num_blocks": 65536, 00:09:54.921 "uuid": "25a97160-698d-44b9-824a-cd9a9f4ad94b", 00:09:54.921 "assigned_rate_limits": { 00:09:54.921 "rw_ios_per_sec": 0, 00:09:54.921 "rw_mbytes_per_sec": 0, 00:09:54.921 "r_mbytes_per_sec": 0, 00:09:54.921 "w_mbytes_per_sec": 0 00:09:54.921 }, 00:09:54.921 "claimed": true, 00:09:54.921 "claim_type": "exclusive_write", 00:09:54.921 "zoned": false, 00:09:54.921 "supported_io_types": { 00:09:54.921 "read": true, 00:09:54.921 "write": true, 00:09:54.921 "unmap": true, 00:09:54.921 "write_zeroes": true, 00:09:54.921 "flush": true, 00:09:54.921 "reset": true, 00:09:54.921 "compare": false, 00:09:54.921 "compare_and_write": false, 00:09:54.921 "abort": true, 00:09:54.921 "nvme_admin": false, 00:09:54.921 "nvme_io": false 00:09:54.921 }, 00:09:54.921 "memory_domains": [ 00:09:54.921 { 00:09:54.921 "dma_device_id": "system", 00:09:54.921 "dma_device_type": 1 00:09:54.921 }, 00:09:54.921 { 00:09:54.921 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:54.921 "dma_device_type": 2 00:09:54.921 } 00:09:54.921 ], 00:09:54.921 "driver_specific": {} 00:09:54.921 }' 00:09:54.921 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:54.921 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:54.921 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:54.921 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:54.921 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:55.180 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:09:55.180 04:10:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:55.180 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:55.180 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:55.180 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:55.180 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:55.180 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:55.180 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:09:55.180 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:55.180 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:09:55.438 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:09:55.438 "name": "BaseBdev2", 00:09:55.438 "aliases": [ 00:09:55.438 "bfef46ad-7afc-4d9f-8630-1b4ef809b813" 00:09:55.438 ], 00:09:55.438 "product_name": "Malloc disk", 00:09:55.438 "block_size": 512, 00:09:55.438 "num_blocks": 65536, 00:09:55.438 "uuid": "bfef46ad-7afc-4d9f-8630-1b4ef809b813", 00:09:55.438 "assigned_rate_limits": { 00:09:55.438 "rw_ios_per_sec": 0, 00:09:55.438 "rw_mbytes_per_sec": 0, 00:09:55.438 "r_mbytes_per_sec": 0, 00:09:55.438 "w_mbytes_per_sec": 0 00:09:55.438 }, 00:09:55.438 "claimed": true, 00:09:55.438 "claim_type": "exclusive_write", 00:09:55.438 "zoned": false, 00:09:55.438 "supported_io_types": { 00:09:55.438 "read": true, 00:09:55.438 "write": true, 00:09:55.438 "unmap": true, 00:09:55.438 "write_zeroes": true, 00:09:55.438 "flush": true, 00:09:55.438 "reset": true, 00:09:55.438 "compare": false, 00:09:55.438 "compare_and_write": false, 00:09:55.438 "abort": true, 00:09:55.438 "nvme_admin": false, 00:09:55.438 "nvme_io": false 00:09:55.438 }, 00:09:55.438 "memory_domains": [ 00:09:55.438 { 00:09:55.438 "dma_device_id": "system", 00:09:55.438 "dma_device_type": 1 00:09:55.438 }, 00:09:55.438 { 00:09:55.438 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:55.438 "dma_device_type": 2 00:09:55.438 } 00:09:55.438 ], 00:09:55.438 "driver_specific": {} 00:09:55.438 }' 00:09:55.438 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:55.438 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:09:55.438 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:09:55.438 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:55.695 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:09:55.695 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:55.695 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:55.695 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:09:55.695 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:55.695 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 
00:09:55.695 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:09:55.695 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:09:55.695 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:55.953 [2024-05-15 04:10:43.878126] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 0 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:09:55.953 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:09:55.954 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:09:55.954 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:09:55.954 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:55.954 04:10:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:56.212 04:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:09:56.212 "name": "Existed_Raid", 00:09:56.212 "uuid": "dc511287-435c-4eeb-82a7-9eb126df6ac5", 00:09:56.212 "strip_size_kb": 0, 00:09:56.212 "state": "online", 00:09:56.212 "raid_level": "raid1", 00:09:56.212 "superblock": true, 00:09:56.212 "num_base_bdevs": 2, 00:09:56.212 "num_base_bdevs_discovered": 1, 00:09:56.212 "num_base_bdevs_operational": 1, 00:09:56.212 "base_bdevs_list": [ 00:09:56.212 { 00:09:56.212 "name": null, 00:09:56.212 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:56.212 "is_configured": false, 00:09:56.212 "data_offset": 2048, 00:09:56.212 "data_size": 63488 00:09:56.212 }, 00:09:56.212 { 00:09:56.212 "name": "BaseBdev2", 00:09:56.212 "uuid": "bfef46ad-7afc-4d9f-8630-1b4ef809b813", 00:09:56.212 "is_configured": true, 00:09:56.212 "data_offset": 2048, 00:09:56.212 "data_size": 63488 00:09:56.212 } 00:09:56.212 ] 00:09:56.212 
}' 00:09:56.212 04:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:09:56.212 04:10:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:56.778 04:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:09:56.778 04:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:09:56.778 04:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:56.778 04:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:09:57.036 04:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:09:57.036 04:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:57.036 04:10:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:57.294 [2024-05-15 04:10:45.179338] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:57.294 [2024-05-15 04:10:45.179428] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:57.294 [2024-05-15 04:10:45.190979] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:57.294 [2024-05-15 04:10:45.191049] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:57.294 [2024-05-15 04:10:45.191062] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc2f6e0 name Existed_Raid, state offline 00:09:57.294 04:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:09:57.294 04:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:09:57.294 04:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:57.294 04:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 3834693 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3834693 ']' 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 3834693 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3834693 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:09:57.552 04:10:45 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3834693' 00:09:57.552 killing process with pid 3834693 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 3834693 00:09:57.552 [2024-05-15 04:10:45.482877] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:57.552 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 3834693 00:09:57.552 [2024-05-15 04:10:45.483999] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:57.810 04:10:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:09:57.810 00:09:57.810 real 0m10.309s 00:09:57.810 user 0m18.583s 00:09:57.810 sys 0m1.485s 00:09:57.810 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:57.810 04:10:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:57.810 ************************************ 00:09:57.810 END TEST raid_state_function_test_sb 00:09:57.810 ************************************ 00:09:57.810 04:10:45 bdev_raid -- bdev/bdev_raid.sh@805 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:09:57.810 04:10:45 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:09:57.810 04:10:45 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:09:57.810 04:10:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:57.810 ************************************ 00:09:57.810 START TEST raid_superblock_test 00:09:57.810 ************************************ 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@412 -- # raid_pid=3836123 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 3836123 /var/tmp/spdk-raid.sock 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 3836123 ']' 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:57.810 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:09:57.810 04:10:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:58.069 [2024-05-15 04:10:45.861438] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:09:58.069 [2024-05-15 04:10:45.861513] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3836123 ] 00:09:58.069 [2024-05-15 04:10:45.942386] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:58.069 [2024-05-15 04:10:46.051336] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:58.326 [2024-05-15 04:10:46.123094] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:58.326 [2024-05-15 04:10:46.123142] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:58.891 04:10:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:09:59.149 malloc1 00:09:59.149 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:59.407 [2024-05-15 04:10:47.371398] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:59.407 [2024-05-15 04:10:47.371443] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:59.407 [2024-05-15 04:10:47.371470] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8dec20 00:09:59.407 [2024-05-15 04:10:47.371487] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:59.407 [2024-05-15 04:10:47.373030] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:59.407 [2024-05-15 04:10:47.373059] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:59.407 pt1 00:09:59.407 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:09:59.407 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:09:59.407 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:09:59.407 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:09:59.407 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:09:59.407 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:59.407 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:09:59.407 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:59.407 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:09:59.665 malloc2 00:09:59.665 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:59.923 [2024-05-15 04:10:47.875488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:59.923 [2024-05-15 04:10:47.875562] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:59.923 [2024-05-15 04:10:47.875589] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8d6c00 00:09:59.923 [2024-05-15 04:10:47.875605] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:59.923 [2024-05-15 04:10:47.877576] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:59.923 [2024-05-15 04:10:47.877606] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:59.923 pt2 00:09:59.923 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:09:59.923 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:09:59.923 04:10:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:10:00.181 [2024-05-15 04:10:48.116160] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:00.181 [2024-05-15 04:10:48.117413] 
bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:00.181 [2024-05-15 04:10:48.117582] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x8d7230 00:10:00.181 [2024-05-15 04:10:48.117597] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:00.181 [2024-05-15 04:10:48.117800] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8f5b10 00:10:00.181 [2024-05-15 04:10:48.117995] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8d7230 00:10:00.181 [2024-05-15 04:10:48.118009] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x8d7230 00:10:00.181 [2024-05-15 04:10:48.118149] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:00.181 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:00.439 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:00.439 "name": "raid_bdev1", 00:10:00.439 "uuid": "3f11934a-cf1e-495a-9fd0-686d0e0d1a88", 00:10:00.439 "strip_size_kb": 0, 00:10:00.439 "state": "online", 00:10:00.439 "raid_level": "raid1", 00:10:00.439 "superblock": true, 00:10:00.439 "num_base_bdevs": 2, 00:10:00.439 "num_base_bdevs_discovered": 2, 00:10:00.439 "num_base_bdevs_operational": 2, 00:10:00.439 "base_bdevs_list": [ 00:10:00.439 { 00:10:00.439 "name": "pt1", 00:10:00.439 "uuid": "991bdb35-2f74-5d9e-acab-087b63876004", 00:10:00.439 "is_configured": true, 00:10:00.439 "data_offset": 2048, 00:10:00.439 "data_size": 63488 00:10:00.439 }, 00:10:00.439 { 00:10:00.439 "name": "pt2", 00:10:00.439 "uuid": "c956e420-845e-5d32-91c3-22174e4ee58c", 00:10:00.439 "is_configured": true, 00:10:00.439 "data_offset": 2048, 00:10:00.439 "data_size": 63488 00:10:00.439 } 00:10:00.439 ] 00:10:00.439 }' 00:10:00.439 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:00.439 04:10:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:01.004 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:10:01.004 
04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:10:01.004 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:01.004 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:01.004 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:01.004 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:01.004 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:01.004 04:10:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:01.261 [2024-05-15 04:10:49.199254] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:01.261 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:01.261 "name": "raid_bdev1", 00:10:01.261 "aliases": [ 00:10:01.261 "3f11934a-cf1e-495a-9fd0-686d0e0d1a88" 00:10:01.261 ], 00:10:01.261 "product_name": "Raid Volume", 00:10:01.261 "block_size": 512, 00:10:01.261 "num_blocks": 63488, 00:10:01.261 "uuid": "3f11934a-cf1e-495a-9fd0-686d0e0d1a88", 00:10:01.261 "assigned_rate_limits": { 00:10:01.261 "rw_ios_per_sec": 0, 00:10:01.261 "rw_mbytes_per_sec": 0, 00:10:01.261 "r_mbytes_per_sec": 0, 00:10:01.261 "w_mbytes_per_sec": 0 00:10:01.261 }, 00:10:01.261 "claimed": false, 00:10:01.261 "zoned": false, 00:10:01.261 "supported_io_types": { 00:10:01.261 "read": true, 00:10:01.261 "write": true, 00:10:01.261 "unmap": false, 00:10:01.261 "write_zeroes": true, 00:10:01.261 "flush": false, 00:10:01.261 "reset": true, 00:10:01.261 "compare": false, 00:10:01.261 "compare_and_write": false, 00:10:01.261 "abort": false, 00:10:01.261 "nvme_admin": false, 00:10:01.261 "nvme_io": false 00:10:01.261 }, 00:10:01.261 "memory_domains": [ 00:10:01.261 { 00:10:01.261 "dma_device_id": "system", 00:10:01.261 "dma_device_type": 1 00:10:01.261 }, 00:10:01.261 { 00:10:01.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:01.261 "dma_device_type": 2 00:10:01.261 }, 00:10:01.261 { 00:10:01.261 "dma_device_id": "system", 00:10:01.261 "dma_device_type": 1 00:10:01.261 }, 00:10:01.261 { 00:10:01.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:01.261 "dma_device_type": 2 00:10:01.261 } 00:10:01.261 ], 00:10:01.261 "driver_specific": { 00:10:01.261 "raid": { 00:10:01.261 "uuid": "3f11934a-cf1e-495a-9fd0-686d0e0d1a88", 00:10:01.261 "strip_size_kb": 0, 00:10:01.261 "state": "online", 00:10:01.261 "raid_level": "raid1", 00:10:01.261 "superblock": true, 00:10:01.261 "num_base_bdevs": 2, 00:10:01.261 "num_base_bdevs_discovered": 2, 00:10:01.261 "num_base_bdevs_operational": 2, 00:10:01.261 "base_bdevs_list": [ 00:10:01.261 { 00:10:01.261 "name": "pt1", 00:10:01.261 "uuid": "991bdb35-2f74-5d9e-acab-087b63876004", 00:10:01.261 "is_configured": true, 00:10:01.261 "data_offset": 2048, 00:10:01.261 "data_size": 63488 00:10:01.261 }, 00:10:01.261 { 00:10:01.261 "name": "pt2", 00:10:01.261 "uuid": "c956e420-845e-5d32-91c3-22174e4ee58c", 00:10:01.261 "is_configured": true, 00:10:01.261 "data_offset": 2048, 00:10:01.261 "data_size": 63488 00:10:01.261 } 00:10:01.261 ] 00:10:01.261 } 00:10:01.261 } 00:10:01.261 }' 00:10:01.261 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == 
true).name' 00:10:01.261 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:10:01.261 pt2' 00:10:01.261 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:01.261 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:01.261 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:01.518 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:01.518 "name": "pt1", 00:10:01.518 "aliases": [ 00:10:01.518 "991bdb35-2f74-5d9e-acab-087b63876004" 00:10:01.518 ], 00:10:01.518 "product_name": "passthru", 00:10:01.518 "block_size": 512, 00:10:01.518 "num_blocks": 65536, 00:10:01.518 "uuid": "991bdb35-2f74-5d9e-acab-087b63876004", 00:10:01.518 "assigned_rate_limits": { 00:10:01.518 "rw_ios_per_sec": 0, 00:10:01.518 "rw_mbytes_per_sec": 0, 00:10:01.518 "r_mbytes_per_sec": 0, 00:10:01.518 "w_mbytes_per_sec": 0 00:10:01.518 }, 00:10:01.518 "claimed": true, 00:10:01.518 "claim_type": "exclusive_write", 00:10:01.518 "zoned": false, 00:10:01.518 "supported_io_types": { 00:10:01.518 "read": true, 00:10:01.518 "write": true, 00:10:01.518 "unmap": true, 00:10:01.518 "write_zeroes": true, 00:10:01.518 "flush": true, 00:10:01.518 "reset": true, 00:10:01.518 "compare": false, 00:10:01.518 "compare_and_write": false, 00:10:01.518 "abort": true, 00:10:01.518 "nvme_admin": false, 00:10:01.518 "nvme_io": false 00:10:01.518 }, 00:10:01.518 "memory_domains": [ 00:10:01.518 { 00:10:01.518 "dma_device_id": "system", 00:10:01.518 "dma_device_type": 1 00:10:01.518 }, 00:10:01.518 { 00:10:01.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:01.518 "dma_device_type": 2 00:10:01.518 } 00:10:01.518 ], 00:10:01.518 "driver_specific": { 00:10:01.518 "passthru": { 00:10:01.518 "name": "pt1", 00:10:01.518 "base_bdev_name": "malloc1" 00:10:01.518 } 00:10:01.518 } 00:10:01.518 }' 00:10:01.518 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:01.783 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:01.783 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:01.783 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:01.783 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:01.783 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:01.783 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:01.783 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:01.783 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:01.783 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:01.783 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:02.041 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:02.041 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:02.041 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
pt2 00:10:02.041 04:10:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:02.041 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:02.041 "name": "pt2", 00:10:02.041 "aliases": [ 00:10:02.041 "c956e420-845e-5d32-91c3-22174e4ee58c" 00:10:02.041 ], 00:10:02.041 "product_name": "passthru", 00:10:02.041 "block_size": 512, 00:10:02.041 "num_blocks": 65536, 00:10:02.041 "uuid": "c956e420-845e-5d32-91c3-22174e4ee58c", 00:10:02.041 "assigned_rate_limits": { 00:10:02.041 "rw_ios_per_sec": 0, 00:10:02.041 "rw_mbytes_per_sec": 0, 00:10:02.041 "r_mbytes_per_sec": 0, 00:10:02.041 "w_mbytes_per_sec": 0 00:10:02.041 }, 00:10:02.041 "claimed": true, 00:10:02.041 "claim_type": "exclusive_write", 00:10:02.041 "zoned": false, 00:10:02.041 "supported_io_types": { 00:10:02.041 "read": true, 00:10:02.041 "write": true, 00:10:02.041 "unmap": true, 00:10:02.041 "write_zeroes": true, 00:10:02.041 "flush": true, 00:10:02.041 "reset": true, 00:10:02.041 "compare": false, 00:10:02.041 "compare_and_write": false, 00:10:02.041 "abort": true, 00:10:02.041 "nvme_admin": false, 00:10:02.041 "nvme_io": false 00:10:02.041 }, 00:10:02.041 "memory_domains": [ 00:10:02.041 { 00:10:02.041 "dma_device_id": "system", 00:10:02.041 "dma_device_type": 1 00:10:02.041 }, 00:10:02.041 { 00:10:02.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:02.041 "dma_device_type": 2 00:10:02.041 } 00:10:02.041 ], 00:10:02.041 "driver_specific": { 00:10:02.041 "passthru": { 00:10:02.041 "name": "pt2", 00:10:02.041 "base_bdev_name": "malloc2" 00:10:02.041 } 00:10:02.041 } 00:10:02.041 }' 00:10:02.041 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:02.297 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:02.297 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:02.297 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:02.297 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:02.297 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:02.297 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:02.297 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:02.297 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:02.297 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:02.297 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:02.553 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:02.553 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:02.553 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:10:02.809 [2024-05-15 04:10:50.591063] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:02.809 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=3f11934a-cf1e-495a-9fd0-686d0e0d1a88 00:10:02.809 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 3f11934a-cf1e-495a-9fd0-686d0e0d1a88 ']' 00:10:02.809 04:10:50 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:03.066 [2024-05-15 04:10:50.835480] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:03.066 [2024-05-15 04:10:50.835513] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:03.066 [2024-05-15 04:10:50.835594] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:03.066 [2024-05-15 04:10:50.835664] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:03.066 [2024-05-15 04:10:50.835678] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8d7230 name raid_bdev1, state offline 00:10:03.066 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:03.066 04:10:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:10:03.324 04:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:10:03.324 04:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:10:03.324 04:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:10:03.324 04:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:03.324 04:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:10:03.324 04:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:03.581 04:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:03.581 04:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
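For reference, everything in this superblock test is driven over JSON-RPC against the bdev_svc app started above with -r /var/tmp/spdk-raid.sock -L bdev_raid. A minimal by-hand sketch of the setup sequence exercised so far, using only calls that appear in this trace (it assumes you run it from the SPDK source tree; the bdev names, sizes and UUIDs are the ones the test itself uses):

    # Two 32 MiB malloc bdevs with 512-byte blocks, each wrapped in a passthru bdev with a fixed UUID
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    # RAID1 volume with an on-disk superblock (-s), as created at bdev_raid.sh@430
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
    # The two state checks the test keeps repeating: raid-level view and full bdev properties
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 | jq '.[]'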
00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:03.839 04:10:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:10:04.097 [2024-05-15 04:10:52.102848] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:04.097 [2024-05-15 04:10:52.104274] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:04.097 [2024-05-15 04:10:52.104343] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:04.097 [2024-05-15 04:10:52.104408] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:04.097 [2024-05-15 04:10:52.104437] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:04.097 [2024-05-15 04:10:52.104450] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8dfef0 name raid_bdev1, state configuring 00:10:04.097 request: 00:10:04.097 { 00:10:04.097 "name": "raid_bdev1", 00:10:04.097 "raid_level": "raid1", 00:10:04.097 "base_bdevs": [ 00:10:04.097 "malloc1", 00:10:04.097 "malloc2" 00:10:04.097 ], 00:10:04.097 "superblock": false, 00:10:04.097 "method": "bdev_raid_create", 00:10:04.097 "req_id": 1 00:10:04.097 } 00:10:04.097 Got JSON-RPC error response 00:10:04.097 response: 00:10:04.097 { 00:10:04.097 "code": -17, 00:10:04.097 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:04.097 } 00:10:04.355 04:10:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:10:04.355 04:10:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:04.355 04:10:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:04.355 04:10:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:04.355 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:04.355 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:10:04.355 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:10:04.355 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:10:04.355 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:04.613 [2024-05-15 04:10:52.584085] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:04.613 [2024-05-15 04:10:52.584140] vbdev_passthru.c: 
636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:04.613 [2024-05-15 04:10:52.584167] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8d7920 00:10:04.613 [2024-05-15 04:10:52.584194] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:04.613 [2024-05-15 04:10:52.585890] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:04.613 [2024-05-15 04:10:52.585920] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:04.613 [2024-05-15 04:10:52.586002] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:04.613 [2024-05-15 04:10:52.586043] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:04.613 pt1 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:04.613 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:05.180 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:05.180 "name": "raid_bdev1", 00:10:05.180 "uuid": "3f11934a-cf1e-495a-9fd0-686d0e0d1a88", 00:10:05.180 "strip_size_kb": 0, 00:10:05.180 "state": "configuring", 00:10:05.180 "raid_level": "raid1", 00:10:05.180 "superblock": true, 00:10:05.180 "num_base_bdevs": 2, 00:10:05.180 "num_base_bdevs_discovered": 1, 00:10:05.180 "num_base_bdevs_operational": 2, 00:10:05.180 "base_bdevs_list": [ 00:10:05.180 { 00:10:05.180 "name": "pt1", 00:10:05.180 "uuid": "991bdb35-2f74-5d9e-acab-087b63876004", 00:10:05.180 "is_configured": true, 00:10:05.180 "data_offset": 2048, 00:10:05.180 "data_size": 63488 00:10:05.180 }, 00:10:05.180 { 00:10:05.180 "name": null, 00:10:05.180 "uuid": "c956e420-845e-5d32-91c3-22174e4ee58c", 00:10:05.180 "is_configured": false, 00:10:05.180 "data_offset": 2048, 00:10:05.180 "data_size": 63488 00:10:05.180 } 00:10:05.180 ] 00:10:05.180 }' 00:10:05.180 04:10:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:05.180 04:10:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:05.438 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:10:05.438 04:10:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:10:05.438 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:10:05.438 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:05.694 [2024-05-15 04:10:53.662969] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:05.694 [2024-05-15 04:10:53.663039] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:05.694 [2024-05-15 04:10:53.663066] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8d6e30 00:10:05.694 [2024-05-15 04:10:53.663083] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:05.694 [2024-05-15 04:10:53.663488] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:05.694 [2024-05-15 04:10:53.663516] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:05.694 [2024-05-15 04:10:53.663602] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:05.694 [2024-05-15 04:10:53.663633] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:05.694 [2024-05-15 04:10:53.663758] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x8dc6c0 00:10:05.694 [2024-05-15 04:10:53.663775] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:05.694 [2024-05-15 04:10:53.663953] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8f5b10 00:10:05.694 [2024-05-15 04:10:53.664113] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8dc6c0 00:10:05.694 [2024-05-15 04:10:53.664129] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x8dc6c0 00:10:05.694 [2024-05-15 04:10:53.664246] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:05.694 pt2 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:05.694 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:05.951 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:05.951 "name": "raid_bdev1", 00:10:05.951 "uuid": "3f11934a-cf1e-495a-9fd0-686d0e0d1a88", 00:10:05.951 "strip_size_kb": 0, 00:10:05.951 "state": "online", 00:10:05.951 "raid_level": "raid1", 00:10:05.951 "superblock": true, 00:10:05.951 "num_base_bdevs": 2, 00:10:05.951 "num_base_bdevs_discovered": 2, 00:10:05.951 "num_base_bdevs_operational": 2, 00:10:05.951 "base_bdevs_list": [ 00:10:05.951 { 00:10:05.951 "name": "pt1", 00:10:05.951 "uuid": "991bdb35-2f74-5d9e-acab-087b63876004", 00:10:05.951 "is_configured": true, 00:10:05.951 "data_offset": 2048, 00:10:05.951 "data_size": 63488 00:10:05.951 }, 00:10:05.951 { 00:10:05.951 "name": "pt2", 00:10:05.951 "uuid": "c956e420-845e-5d32-91c3-22174e4ee58c", 00:10:05.951 "is_configured": true, 00:10:05.951 "data_offset": 2048, 00:10:05.951 "data_size": 63488 00:10:05.951 } 00:10:05.951 ] 00:10:05.951 }' 00:10:05.951 04:10:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:05.951 04:10:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:06.517 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:10:06.517 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:10:06.517 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:06.517 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:06.517 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:06.517 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:06.517 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:06.517 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:06.774 [2024-05-15 04:10:54.750134] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:06.774 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:06.774 "name": "raid_bdev1", 00:10:06.774 "aliases": [ 00:10:06.774 "3f11934a-cf1e-495a-9fd0-686d0e0d1a88" 00:10:06.774 ], 00:10:06.774 "product_name": "Raid Volume", 00:10:06.774 "block_size": 512, 00:10:06.774 "num_blocks": 63488, 00:10:06.774 "uuid": "3f11934a-cf1e-495a-9fd0-686d0e0d1a88", 00:10:06.774 "assigned_rate_limits": { 00:10:06.774 "rw_ios_per_sec": 0, 00:10:06.774 "rw_mbytes_per_sec": 0, 00:10:06.774 "r_mbytes_per_sec": 0, 00:10:06.774 "w_mbytes_per_sec": 0 00:10:06.774 }, 00:10:06.774 "claimed": false, 00:10:06.774 "zoned": false, 00:10:06.774 "supported_io_types": { 00:10:06.774 "read": true, 00:10:06.774 "write": true, 00:10:06.774 "unmap": false, 00:10:06.774 "write_zeroes": true, 00:10:06.774 "flush": false, 00:10:06.774 "reset": true, 00:10:06.774 "compare": false, 00:10:06.774 "compare_and_write": false, 00:10:06.774 "abort": false, 00:10:06.774 "nvme_admin": false, 00:10:06.774 "nvme_io": false 00:10:06.774 }, 00:10:06.774 "memory_domains": [ 
00:10:06.774 { 00:10:06.774 "dma_device_id": "system", 00:10:06.774 "dma_device_type": 1 00:10:06.774 }, 00:10:06.774 { 00:10:06.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:06.774 "dma_device_type": 2 00:10:06.774 }, 00:10:06.774 { 00:10:06.774 "dma_device_id": "system", 00:10:06.774 "dma_device_type": 1 00:10:06.774 }, 00:10:06.774 { 00:10:06.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:06.774 "dma_device_type": 2 00:10:06.774 } 00:10:06.774 ], 00:10:06.774 "driver_specific": { 00:10:06.774 "raid": { 00:10:06.774 "uuid": "3f11934a-cf1e-495a-9fd0-686d0e0d1a88", 00:10:06.774 "strip_size_kb": 0, 00:10:06.774 "state": "online", 00:10:06.774 "raid_level": "raid1", 00:10:06.774 "superblock": true, 00:10:06.774 "num_base_bdevs": 2, 00:10:06.774 "num_base_bdevs_discovered": 2, 00:10:06.774 "num_base_bdevs_operational": 2, 00:10:06.774 "base_bdevs_list": [ 00:10:06.774 { 00:10:06.774 "name": "pt1", 00:10:06.774 "uuid": "991bdb35-2f74-5d9e-acab-087b63876004", 00:10:06.774 "is_configured": true, 00:10:06.774 "data_offset": 2048, 00:10:06.774 "data_size": 63488 00:10:06.774 }, 00:10:06.774 { 00:10:06.774 "name": "pt2", 00:10:06.774 "uuid": "c956e420-845e-5d32-91c3-22174e4ee58c", 00:10:06.774 "is_configured": true, 00:10:06.774 "data_offset": 2048, 00:10:06.774 "data_size": 63488 00:10:06.774 } 00:10:06.774 ] 00:10:06.774 } 00:10:06.774 } 00:10:06.774 }' 00:10:06.774 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:07.032 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:10:07.032 pt2' 00:10:07.032 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:07.032 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:07.032 04:10:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:07.290 "name": "pt1", 00:10:07.290 "aliases": [ 00:10:07.290 "991bdb35-2f74-5d9e-acab-087b63876004" 00:10:07.290 ], 00:10:07.290 "product_name": "passthru", 00:10:07.290 "block_size": 512, 00:10:07.290 "num_blocks": 65536, 00:10:07.290 "uuid": "991bdb35-2f74-5d9e-acab-087b63876004", 00:10:07.290 "assigned_rate_limits": { 00:10:07.290 "rw_ios_per_sec": 0, 00:10:07.290 "rw_mbytes_per_sec": 0, 00:10:07.290 "r_mbytes_per_sec": 0, 00:10:07.290 "w_mbytes_per_sec": 0 00:10:07.290 }, 00:10:07.290 "claimed": true, 00:10:07.290 "claim_type": "exclusive_write", 00:10:07.290 "zoned": false, 00:10:07.290 "supported_io_types": { 00:10:07.290 "read": true, 00:10:07.290 "write": true, 00:10:07.290 "unmap": true, 00:10:07.290 "write_zeroes": true, 00:10:07.290 "flush": true, 00:10:07.290 "reset": true, 00:10:07.290 "compare": false, 00:10:07.290 "compare_and_write": false, 00:10:07.290 "abort": true, 00:10:07.290 "nvme_admin": false, 00:10:07.290 "nvme_io": false 00:10:07.290 }, 00:10:07.290 "memory_domains": [ 00:10:07.290 { 00:10:07.290 "dma_device_id": "system", 00:10:07.290 "dma_device_type": 1 00:10:07.290 }, 00:10:07.290 { 00:10:07.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:07.290 "dma_device_type": 2 00:10:07.290 } 00:10:07.290 ], 00:10:07.290 "driver_specific": { 00:10:07.290 "passthru": { 00:10:07.290 "name": "pt1", 00:10:07.290 "base_bdev_name": 
"malloc1" 00:10:07.290 } 00:10:07.290 } 00:10:07.290 }' 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:07.290 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:07.548 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:07.548 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:07.548 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:07.548 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:07.548 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:07.806 "name": "pt2", 00:10:07.806 "aliases": [ 00:10:07.806 "c956e420-845e-5d32-91c3-22174e4ee58c" 00:10:07.806 ], 00:10:07.806 "product_name": "passthru", 00:10:07.806 "block_size": 512, 00:10:07.806 "num_blocks": 65536, 00:10:07.806 "uuid": "c956e420-845e-5d32-91c3-22174e4ee58c", 00:10:07.806 "assigned_rate_limits": { 00:10:07.806 "rw_ios_per_sec": 0, 00:10:07.806 "rw_mbytes_per_sec": 0, 00:10:07.806 "r_mbytes_per_sec": 0, 00:10:07.806 "w_mbytes_per_sec": 0 00:10:07.806 }, 00:10:07.806 "claimed": true, 00:10:07.806 "claim_type": "exclusive_write", 00:10:07.806 "zoned": false, 00:10:07.806 "supported_io_types": { 00:10:07.806 "read": true, 00:10:07.806 "write": true, 00:10:07.806 "unmap": true, 00:10:07.806 "write_zeroes": true, 00:10:07.806 "flush": true, 00:10:07.806 "reset": true, 00:10:07.806 "compare": false, 00:10:07.806 "compare_and_write": false, 00:10:07.806 "abort": true, 00:10:07.806 "nvme_admin": false, 00:10:07.806 "nvme_io": false 00:10:07.806 }, 00:10:07.806 "memory_domains": [ 00:10:07.806 { 00:10:07.806 "dma_device_id": "system", 00:10:07.806 "dma_device_type": 1 00:10:07.806 }, 00:10:07.806 { 00:10:07.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:07.806 "dma_device_type": 2 00:10:07.806 } 00:10:07.806 ], 00:10:07.806 "driver_specific": { 00:10:07.806 "passthru": { 00:10:07.806 "name": "pt2", 00:10:07.806 "base_bdev_name": "malloc2" 00:10:07.806 } 00:10:07.806 } 00:10:07.806 }' 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # 
jq .md_size 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:07.806 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:08.064 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:08.064 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:08.064 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:08.064 04:10:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:10:08.329 [2024-05-15 04:10:56.121806] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:08.329 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 3f11934a-cf1e-495a-9fd0-686d0e0d1a88 '!=' 3f11934a-cf1e-495a-9fd0-686d0e0d1a88 ']' 00:10:08.329 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:10:08.329 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:08.329 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 0 00:10:08.329 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:10:08.639 [2024-05-15 04:10:56.366321] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:08.639 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:08.897 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:08.897 "name": "raid_bdev1", 
00:10:08.897 "uuid": "3f11934a-cf1e-495a-9fd0-686d0e0d1a88", 00:10:08.897 "strip_size_kb": 0, 00:10:08.898 "state": "online", 00:10:08.898 "raid_level": "raid1", 00:10:08.898 "superblock": true, 00:10:08.898 "num_base_bdevs": 2, 00:10:08.898 "num_base_bdevs_discovered": 1, 00:10:08.898 "num_base_bdevs_operational": 1, 00:10:08.898 "base_bdevs_list": [ 00:10:08.898 { 00:10:08.898 "name": null, 00:10:08.898 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:08.898 "is_configured": false, 00:10:08.898 "data_offset": 2048, 00:10:08.898 "data_size": 63488 00:10:08.898 }, 00:10:08.898 { 00:10:08.898 "name": "pt2", 00:10:08.898 "uuid": "c956e420-845e-5d32-91c3-22174e4ee58c", 00:10:08.898 "is_configured": true, 00:10:08.898 "data_offset": 2048, 00:10:08.898 "data_size": 63488 00:10:08.898 } 00:10:08.898 ] 00:10:08.898 }' 00:10:08.898 04:10:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:08.898 04:10:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:09.464 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:09.464 [2024-05-15 04:10:57.433112] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:09.464 [2024-05-15 04:10:57.433139] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:09.464 [2024-05-15 04:10:57.433215] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:09.464 [2024-05-15 04:10:57.433275] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:09.464 [2024-05-15 04:10:57.433292] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8dc6c0 name raid_bdev1, state offline 00:10:09.464 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:09.464 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:10:09.722 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:10:09.722 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:10:09.722 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:10:09.722 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:10:09.722 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:09.980 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:10:09.980 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:10:09.980 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:10:09.980 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:10:09.980 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # i=1 00:10:09.980 04:10:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:10.239 [2024-05-15 
04:10:58.199139] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:10.239 [2024-05-15 04:10:58.199198] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:10.239 [2024-05-15 04:10:58.199226] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8d8650 00:10:10.239 [2024-05-15 04:10:58.199243] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:10.239 [2024-05-15 04:10:58.201062] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:10.239 [2024-05-15 04:10:58.201093] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:10.239 [2024-05-15 04:10:58.201191] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:10.239 [2024-05-15 04:10:58.201237] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:10.239 [2024-05-15 04:10:58.201365] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x8dbd80 00:10:10.239 [2024-05-15 04:10:58.201382] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:10.239 [2024-05-15 04:10:58.201554] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8dccf0 00:10:10.239 [2024-05-15 04:10:58.201712] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8dbd80 00:10:10.239 [2024-05-15 04:10:58.201729] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x8dbd80 00:10:10.239 [2024-05-15 04:10:58.201855] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:10.239 pt2 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:10.239 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:10.498 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:10.498 "name": "raid_bdev1", 00:10:10.498 "uuid": "3f11934a-cf1e-495a-9fd0-686d0e0d1a88", 00:10:10.498 "strip_size_kb": 0, 00:10:10.498 "state": "online", 00:10:10.498 "raid_level": "raid1", 00:10:10.498 "superblock": true, 00:10:10.498 "num_base_bdevs": 2, 00:10:10.498 "num_base_bdevs_discovered": 1, 
00:10:10.498 "num_base_bdevs_operational": 1, 00:10:10.498 "base_bdevs_list": [ 00:10:10.498 { 00:10:10.498 "name": null, 00:10:10.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:10.498 "is_configured": false, 00:10:10.498 "data_offset": 2048, 00:10:10.498 "data_size": 63488 00:10:10.498 }, 00:10:10.498 { 00:10:10.498 "name": "pt2", 00:10:10.498 "uuid": "c956e420-845e-5d32-91c3-22174e4ee58c", 00:10:10.498 "is_configured": true, 00:10:10.498 "data_offset": 2048, 00:10:10.498 "data_size": 63488 00:10:10.498 } 00:10:10.498 ] 00:10:10.498 }' 00:10:10.498 04:10:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:10.498 04:10:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:11.063 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:11.321 [2024-05-15 04:10:59.241865] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:11.321 [2024-05-15 04:10:59.241895] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:11.321 [2024-05-15 04:10:59.241970] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:11.321 [2024-05-15 04:10:59.242033] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:11.321 [2024-05-15 04:10:59.242049] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8dbd80 name raid_bdev1, state offline 00:10:11.321 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:11.321 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # jq -r '.[]' 00:10:11.578 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # raid_bdev= 00:10:11.578 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # '[' -n '' ']' 00:10:11.578 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@532 -- # '[' 2 -gt 2 ']' 00:10:11.578 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:11.837 [2024-05-15 04:10:59.735157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:11.837 [2024-05-15 04:10:59.735220] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:11.837 [2024-05-15 04:10:59.735245] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8de180 00:10:11.837 [2024-05-15 04:10:59.735258] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:11.837 [2024-05-15 04:10:59.736721] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:11.837 [2024-05-15 04:10:59.736744] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:11.837 [2024-05-15 04:10:59.736862] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:11.837 [2024-05-15 04:10:59.736900] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:11.837 [2024-05-15 04:10:59.737008] bdev_raid.c:3487:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) 
greater than existing raid bdev raid_bdev1 (2) 00:10:11.837 [2024-05-15 04:10:59.737023] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:11.837 [2024-05-15 04:10:59.737037] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8dcfa0 name raid_bdev1, state configuring 00:10:11.837 [2024-05-15 04:10:59.737061] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:11.837 [2024-05-15 04:10:59.737144] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x8ddc30 00:10:11.837 [2024-05-15 04:10:59.737157] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:10:11.837 [2024-05-15 04:10:59.737297] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8dc690 00:10:11.837 [2024-05-15 04:10:59.737420] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8ddc30 00:10:11.837 [2024-05-15 04:10:59.737433] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x8ddc30 00:10:11.837 [2024-05-15 04:10:59.737519] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:11.837 pt1 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # '[' 2 -gt 2 ']' 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:11.837 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:12.095 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:12.095 "name": "raid_bdev1", 00:10:12.095 "uuid": "3f11934a-cf1e-495a-9fd0-686d0e0d1a88", 00:10:12.095 "strip_size_kb": 0, 00:10:12.095 "state": "online", 00:10:12.095 "raid_level": "raid1", 00:10:12.095 "superblock": true, 00:10:12.095 "num_base_bdevs": 2, 00:10:12.095 "num_base_bdevs_discovered": 1, 00:10:12.095 "num_base_bdevs_operational": 1, 00:10:12.095 "base_bdevs_list": [ 00:10:12.095 { 00:10:12.095 "name": null, 00:10:12.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:12.095 "is_configured": false, 00:10:12.095 "data_offset": 2048, 00:10:12.095 "data_size": 63488 00:10:12.095 }, 00:10:12.095 { 00:10:12.095 "name": "pt2", 00:10:12.095 "uuid": "c956e420-845e-5d32-91c3-22174e4ee58c", 00:10:12.095 
"is_configured": true, 00:10:12.095 "data_offset": 2048, 00:10:12.095 "data_size": 63488 00:10:12.095 } 00:10:12.095 ] 00:10:12.095 }' 00:10:12.095 04:10:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:12.095 04:10:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:12.660 04:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:10:12.660 04:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:10:12.917 04:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # [[ false == \f\a\l\s\e ]] 00:10:12.917 04:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@558 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:12.917 04:11:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@558 -- # jq -r '.[] | .uuid' 00:10:13.174 [2024-05-15 04:11:01.006725] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@558 -- # '[' 3f11934a-cf1e-495a-9fd0-686d0e0d1a88 '!=' 3f11934a-cf1e-495a-9fd0-686d0e0d1a88 ']' 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # killprocess 3836123 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 3836123 ']' 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 3836123 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3836123 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3836123' 00:10:13.174 killing process with pid 3836123 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 3836123 00:10:13.174 [2024-05-15 04:11:01.052743] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:13.174 [2024-05-15 04:11:01.052853] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:13.174 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 3836123 00:10:13.174 [2024-05-15 04:11:01.052926] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:13.174 [2024-05-15 04:11:01.052942] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8ddc30 name raid_bdev1, state offline 00:10:13.174 [2024-05-15 04:11:01.074655] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:13.432 04:11:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@565 -- # return 0 00:10:13.432 00:10:13.432 real 0m15.524s 00:10:13.432 user 0m28.628s 00:10:13.432 sys 0m2.208s 00:10:13.432 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 
00:10:13.432 04:11:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:13.432 ************************************ 00:10:13.432 END TEST raid_superblock_test 00:10:13.432 ************************************ 00:10:13.432 04:11:01 bdev_raid -- bdev/bdev_raid.sh@801 -- # for n in {2..4} 00:10:13.432 04:11:01 bdev_raid -- bdev/bdev_raid.sh@802 -- # for level in raid0 concat raid1 00:10:13.432 04:11:01 bdev_raid -- bdev/bdev_raid.sh@803 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:10:13.432 04:11:01 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:10:13.432 04:11:01 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:13.432 04:11:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:13.432 ************************************ 00:10:13.432 START TEST raid_state_function_test 00:10:13.432 ************************************ 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 3 false 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # 
strip_size=64 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=3838310 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3838310' 00:10:13.432 Process raid pid: 3838310 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 3838310 /var/tmp/spdk-raid.sock 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 3838310 ']' 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:13.432 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:13.432 04:11:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:13.690 [2024-05-15 04:11:01.448278] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:10:13.690 [2024-05-15 04:11:01.448361] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:13.690 [2024-05-15 04:11:01.531572] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:13.690 [2024-05-15 04:11:01.648795] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:13.947 [2024-05-15 04:11:01.719618] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:13.947 [2024-05-15 04:11:01.719665] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:14.512 04:11:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:14.512 04:11:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:10:14.512 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:14.769 [2024-05-15 04:11:02.643129] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:14.769 [2024-05-15 04:11:02.643176] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:14.769 [2024-05-15 04:11:02.643189] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:14.769 [2024-05-15 04:11:02.643203] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:14.769 [2024-05-15 04:11:02.643212] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:10:14.769 [2024-05-15 04:11:02.643224] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:14.769 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:15.026 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 
00:10:15.026 "name": "Existed_Raid", 00:10:15.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:15.026 "strip_size_kb": 64, 00:10:15.026 "state": "configuring", 00:10:15.026 "raid_level": "raid0", 00:10:15.026 "superblock": false, 00:10:15.026 "num_base_bdevs": 3, 00:10:15.027 "num_base_bdevs_discovered": 0, 00:10:15.027 "num_base_bdevs_operational": 3, 00:10:15.027 "base_bdevs_list": [ 00:10:15.027 { 00:10:15.027 "name": "BaseBdev1", 00:10:15.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:15.027 "is_configured": false, 00:10:15.027 "data_offset": 0, 00:10:15.027 "data_size": 0 00:10:15.027 }, 00:10:15.027 { 00:10:15.027 "name": "BaseBdev2", 00:10:15.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:15.027 "is_configured": false, 00:10:15.027 "data_offset": 0, 00:10:15.027 "data_size": 0 00:10:15.027 }, 00:10:15.027 { 00:10:15.027 "name": "BaseBdev3", 00:10:15.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:15.027 "is_configured": false, 00:10:15.027 "data_offset": 0, 00:10:15.027 "data_size": 0 00:10:15.027 } 00:10:15.027 ] 00:10:15.027 }' 00:10:15.027 04:11:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:15.027 04:11:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:15.603 04:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:15.863 [2024-05-15 04:11:03.685768] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:15.863 [2024-05-15 04:11:03.685804] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x88b020 name Existed_Raid, state configuring 00:10:15.863 04:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:16.121 [2024-05-15 04:11:03.942473] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:16.121 [2024-05-15 04:11:03.942517] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:16.121 [2024-05-15 04:11:03.942539] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:16.121 [2024-05-15 04:11:03.942549] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:16.121 [2024-05-15 04:11:03.942556] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:10:16.121 [2024-05-15 04:11:03.942566] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:10:16.121 04:11:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:16.379 [2024-05-15 04:11:04.193910] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:16.379 BaseBdev1 00:10:16.379 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:10:16.379 04:11:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:10:16.379 04:11:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:16.379 04:11:04 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:16.379 04:11:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:16.379 04:11:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:16.379 04:11:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:16.637 04:11:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:16.895 [ 00:10:16.895 { 00:10:16.895 "name": "BaseBdev1", 00:10:16.895 "aliases": [ 00:10:16.895 "58b07642-67ed-479e-b5ca-70c3acfcf06c" 00:10:16.895 ], 00:10:16.895 "product_name": "Malloc disk", 00:10:16.895 "block_size": 512, 00:10:16.895 "num_blocks": 65536, 00:10:16.895 "uuid": "58b07642-67ed-479e-b5ca-70c3acfcf06c", 00:10:16.895 "assigned_rate_limits": { 00:10:16.895 "rw_ios_per_sec": 0, 00:10:16.895 "rw_mbytes_per_sec": 0, 00:10:16.895 "r_mbytes_per_sec": 0, 00:10:16.895 "w_mbytes_per_sec": 0 00:10:16.895 }, 00:10:16.895 "claimed": true, 00:10:16.895 "claim_type": "exclusive_write", 00:10:16.895 "zoned": false, 00:10:16.895 "supported_io_types": { 00:10:16.895 "read": true, 00:10:16.895 "write": true, 00:10:16.895 "unmap": true, 00:10:16.895 "write_zeroes": true, 00:10:16.895 "flush": true, 00:10:16.895 "reset": true, 00:10:16.895 "compare": false, 00:10:16.895 "compare_and_write": false, 00:10:16.895 "abort": true, 00:10:16.895 "nvme_admin": false, 00:10:16.895 "nvme_io": false 00:10:16.895 }, 00:10:16.895 "memory_domains": [ 00:10:16.895 { 00:10:16.895 "dma_device_id": "system", 00:10:16.895 "dma_device_type": 1 00:10:16.895 }, 00:10:16.895 { 00:10:16.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:16.895 "dma_device_type": 2 00:10:16.895 } 00:10:16.895 ], 00:10:16.895 "driver_specific": {} 00:10:16.895 } 00:10:16.895 ] 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:10:16.895 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:17.154 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:17.154 "name": "Existed_Raid", 00:10:17.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:17.154 "strip_size_kb": 64, 00:10:17.154 "state": "configuring", 00:10:17.154 "raid_level": "raid0", 00:10:17.154 "superblock": false, 00:10:17.154 "num_base_bdevs": 3, 00:10:17.154 "num_base_bdevs_discovered": 1, 00:10:17.154 "num_base_bdevs_operational": 3, 00:10:17.154 "base_bdevs_list": [ 00:10:17.154 { 00:10:17.154 "name": "BaseBdev1", 00:10:17.154 "uuid": "58b07642-67ed-479e-b5ca-70c3acfcf06c", 00:10:17.154 "is_configured": true, 00:10:17.154 "data_offset": 0, 00:10:17.154 "data_size": 65536 00:10:17.154 }, 00:10:17.154 { 00:10:17.154 "name": "BaseBdev2", 00:10:17.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:17.154 "is_configured": false, 00:10:17.154 "data_offset": 0, 00:10:17.154 "data_size": 0 00:10:17.154 }, 00:10:17.154 { 00:10:17.154 "name": "BaseBdev3", 00:10:17.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:17.154 "is_configured": false, 00:10:17.154 "data_offset": 0, 00:10:17.154 "data_size": 0 00:10:17.154 } 00:10:17.154 ] 00:10:17.154 }' 00:10:17.154 04:11:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:17.154 04:11:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:17.718 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:17.718 [2024-05-15 04:11:05.669749] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:17.718 [2024-05-15 04:11:05.669797] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x88a8f0 name Existed_Raid, state configuring 00:10:17.718 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:17.976 [2024-05-15 04:11:05.902390] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:17.976 [2024-05-15 04:11:05.903631] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:17.976 [2024-05-15 04:11:05.903659] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:17.976 [2024-05-15 04:11:05.903680] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:10:17.976 [2024-05-15 04:11:05.903690] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:17.976 04:11:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:17.976 04:11:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:18.234 04:11:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:18.234 "name": "Existed_Raid", 00:10:18.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:18.234 "strip_size_kb": 64, 00:10:18.234 "state": "configuring", 00:10:18.234 "raid_level": "raid0", 00:10:18.234 "superblock": false, 00:10:18.234 "num_base_bdevs": 3, 00:10:18.234 "num_base_bdevs_discovered": 1, 00:10:18.234 "num_base_bdevs_operational": 3, 00:10:18.234 "base_bdevs_list": [ 00:10:18.234 { 00:10:18.234 "name": "BaseBdev1", 00:10:18.234 "uuid": "58b07642-67ed-479e-b5ca-70c3acfcf06c", 00:10:18.234 "is_configured": true, 00:10:18.234 "data_offset": 0, 00:10:18.234 "data_size": 65536 00:10:18.234 }, 00:10:18.234 { 00:10:18.234 "name": "BaseBdev2", 00:10:18.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:18.234 "is_configured": false, 00:10:18.234 "data_offset": 0, 00:10:18.234 "data_size": 0 00:10:18.234 }, 00:10:18.234 { 00:10:18.234 "name": "BaseBdev3", 00:10:18.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:18.234 "is_configured": false, 00:10:18.234 "data_offset": 0, 00:10:18.234 "data_size": 0 00:10:18.234 } 00:10:18.234 ] 00:10:18.234 }' 00:10:18.234 04:11:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:18.234 04:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:18.799 04:11:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:19.056 [2024-05-15 04:11:06.929645] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:19.056 BaseBdev2 00:10:19.056 04:11:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:10:19.056 04:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:10:19.056 04:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:19.056 04:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:19.056 04:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:19.056 04:11:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:19.056 04:11:06 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:19.314 04:11:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:19.572 [ 00:10:19.572 { 00:10:19.572 "name": "BaseBdev2", 00:10:19.572 "aliases": [ 00:10:19.572 "12ce79ad-3907-4a40-88d5-3faf5bddbc0f" 00:10:19.572 ], 00:10:19.572 "product_name": "Malloc disk", 00:10:19.572 "block_size": 512, 00:10:19.572 "num_blocks": 65536, 00:10:19.572 "uuid": "12ce79ad-3907-4a40-88d5-3faf5bddbc0f", 00:10:19.572 "assigned_rate_limits": { 00:10:19.572 "rw_ios_per_sec": 0, 00:10:19.572 "rw_mbytes_per_sec": 0, 00:10:19.572 "r_mbytes_per_sec": 0, 00:10:19.572 "w_mbytes_per_sec": 0 00:10:19.572 }, 00:10:19.572 "claimed": true, 00:10:19.572 "claim_type": "exclusive_write", 00:10:19.572 "zoned": false, 00:10:19.572 "supported_io_types": { 00:10:19.572 "read": true, 00:10:19.572 "write": true, 00:10:19.572 "unmap": true, 00:10:19.572 "write_zeroes": true, 00:10:19.572 "flush": true, 00:10:19.572 "reset": true, 00:10:19.572 "compare": false, 00:10:19.572 "compare_and_write": false, 00:10:19.572 "abort": true, 00:10:19.572 "nvme_admin": false, 00:10:19.572 "nvme_io": false 00:10:19.572 }, 00:10:19.572 "memory_domains": [ 00:10:19.572 { 00:10:19.572 "dma_device_id": "system", 00:10:19.572 "dma_device_type": 1 00:10:19.572 }, 00:10:19.572 { 00:10:19.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:19.572 "dma_device_type": 2 00:10:19.572 } 00:10:19.572 ], 00:10:19.572 "driver_specific": {} 00:10:19.572 } 00:10:19.572 ] 00:10:19.572 04:11:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:10:19.572 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:10:19.572 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:19.572 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:19.572 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:19.573 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:19.573 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:19.573 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:19.573 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:19.573 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:19.573 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:19.573 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:19.573 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:19.573 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:19.573 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:10:19.838 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:19.838 "name": "Existed_Raid", 00:10:19.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:19.839 "strip_size_kb": 64, 00:10:19.839 "state": "configuring", 00:10:19.839 "raid_level": "raid0", 00:10:19.839 "superblock": false, 00:10:19.839 "num_base_bdevs": 3, 00:10:19.839 "num_base_bdevs_discovered": 2, 00:10:19.839 "num_base_bdevs_operational": 3, 00:10:19.839 "base_bdevs_list": [ 00:10:19.839 { 00:10:19.839 "name": "BaseBdev1", 00:10:19.839 "uuid": "58b07642-67ed-479e-b5ca-70c3acfcf06c", 00:10:19.839 "is_configured": true, 00:10:19.839 "data_offset": 0, 00:10:19.839 "data_size": 65536 00:10:19.839 }, 00:10:19.839 { 00:10:19.839 "name": "BaseBdev2", 00:10:19.839 "uuid": "12ce79ad-3907-4a40-88d5-3faf5bddbc0f", 00:10:19.839 "is_configured": true, 00:10:19.839 "data_offset": 0, 00:10:19.839 "data_size": 65536 00:10:19.839 }, 00:10:19.839 { 00:10:19.839 "name": "BaseBdev3", 00:10:19.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:19.839 "is_configured": false, 00:10:19.839 "data_offset": 0, 00:10:19.839 "data_size": 0 00:10:19.839 } 00:10:19.839 ] 00:10:19.839 }' 00:10:19.839 04:11:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:19.839 04:11:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:20.404 04:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:10:20.662 [2024-05-15 04:11:08.547451] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:10:20.662 [2024-05-15 04:11:08.547512] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x88b7e0 00:10:20.662 [2024-05-15 04:11:08.547522] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:10:20.662 [2024-05-15 04:11:08.547725] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8a26d0 00:10:20.662 [2024-05-15 04:11:08.547911] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x88b7e0 00:10:20.662 [2024-05-15 04:11:08.547929] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x88b7e0 00:10:20.662 [2024-05-15 04:11:08.548169] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:20.662 BaseBdev3 00:10:20.662 04:11:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:10:20.662 04:11:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:10:20.662 04:11:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:20.662 04:11:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:20.662 04:11:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:20.662 04:11:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:20.662 04:11:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:20.920 04:11:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:10:21.178 [ 00:10:21.178 { 00:10:21.178 "name": "BaseBdev3", 00:10:21.178 "aliases": [ 00:10:21.178 "c215a411-b3f3-4933-bee0-ce895bb753e8" 00:10:21.178 ], 00:10:21.178 "product_name": "Malloc disk", 00:10:21.178 "block_size": 512, 00:10:21.178 "num_blocks": 65536, 00:10:21.178 "uuid": "c215a411-b3f3-4933-bee0-ce895bb753e8", 00:10:21.178 "assigned_rate_limits": { 00:10:21.178 "rw_ios_per_sec": 0, 00:10:21.178 "rw_mbytes_per_sec": 0, 00:10:21.178 "r_mbytes_per_sec": 0, 00:10:21.178 "w_mbytes_per_sec": 0 00:10:21.178 }, 00:10:21.178 "claimed": true, 00:10:21.178 "claim_type": "exclusive_write", 00:10:21.178 "zoned": false, 00:10:21.178 "supported_io_types": { 00:10:21.178 "read": true, 00:10:21.178 "write": true, 00:10:21.178 "unmap": true, 00:10:21.178 "write_zeroes": true, 00:10:21.178 "flush": true, 00:10:21.178 "reset": true, 00:10:21.178 "compare": false, 00:10:21.178 "compare_and_write": false, 00:10:21.178 "abort": true, 00:10:21.178 "nvme_admin": false, 00:10:21.178 "nvme_io": false 00:10:21.178 }, 00:10:21.178 "memory_domains": [ 00:10:21.178 { 00:10:21.178 "dma_device_id": "system", 00:10:21.178 "dma_device_type": 1 00:10:21.178 }, 00:10:21.178 { 00:10:21.178 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.178 "dma_device_type": 2 00:10:21.178 } 00:10:21.178 ], 00:10:21.178 "driver_specific": {} 00:10:21.178 } 00:10:21.178 ] 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:21.178 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:21.434 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:21.434 "name": "Existed_Raid", 00:10:21.434 "uuid": "1a5ecd21-495e-4728-b222-88bc49f342be", 00:10:21.434 "strip_size_kb": 64, 00:10:21.434 "state": "online", 
00:10:21.434 "raid_level": "raid0", 00:10:21.434 "superblock": false, 00:10:21.434 "num_base_bdevs": 3, 00:10:21.434 "num_base_bdevs_discovered": 3, 00:10:21.434 "num_base_bdevs_operational": 3, 00:10:21.434 "base_bdevs_list": [ 00:10:21.434 { 00:10:21.434 "name": "BaseBdev1", 00:10:21.434 "uuid": "58b07642-67ed-479e-b5ca-70c3acfcf06c", 00:10:21.434 "is_configured": true, 00:10:21.434 "data_offset": 0, 00:10:21.434 "data_size": 65536 00:10:21.434 }, 00:10:21.434 { 00:10:21.434 "name": "BaseBdev2", 00:10:21.434 "uuid": "12ce79ad-3907-4a40-88d5-3faf5bddbc0f", 00:10:21.434 "is_configured": true, 00:10:21.434 "data_offset": 0, 00:10:21.434 "data_size": 65536 00:10:21.434 }, 00:10:21.434 { 00:10:21.434 "name": "BaseBdev3", 00:10:21.434 "uuid": "c215a411-b3f3-4933-bee0-ce895bb753e8", 00:10:21.434 "is_configured": true, 00:10:21.434 "data_offset": 0, 00:10:21.434 "data_size": 65536 00:10:21.434 } 00:10:21.434 ] 00:10:21.434 }' 00:10:21.434 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:21.434 04:11:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:21.999 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:10:21.999 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:10:21.999 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:21.999 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:21.999 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:21.999 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:21.999 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:21.999 04:11:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:22.258 [2024-05-15 04:11:10.059631] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:22.258 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:22.258 "name": "Existed_Raid", 00:10:22.258 "aliases": [ 00:10:22.258 "1a5ecd21-495e-4728-b222-88bc49f342be" 00:10:22.258 ], 00:10:22.258 "product_name": "Raid Volume", 00:10:22.258 "block_size": 512, 00:10:22.258 "num_blocks": 196608, 00:10:22.258 "uuid": "1a5ecd21-495e-4728-b222-88bc49f342be", 00:10:22.258 "assigned_rate_limits": { 00:10:22.258 "rw_ios_per_sec": 0, 00:10:22.258 "rw_mbytes_per_sec": 0, 00:10:22.258 "r_mbytes_per_sec": 0, 00:10:22.258 "w_mbytes_per_sec": 0 00:10:22.258 }, 00:10:22.258 "claimed": false, 00:10:22.258 "zoned": false, 00:10:22.258 "supported_io_types": { 00:10:22.258 "read": true, 00:10:22.258 "write": true, 00:10:22.258 "unmap": true, 00:10:22.258 "write_zeroes": true, 00:10:22.258 "flush": true, 00:10:22.258 "reset": true, 00:10:22.258 "compare": false, 00:10:22.258 "compare_and_write": false, 00:10:22.258 "abort": false, 00:10:22.258 "nvme_admin": false, 00:10:22.258 "nvme_io": false 00:10:22.258 }, 00:10:22.258 "memory_domains": [ 00:10:22.258 { 00:10:22.258 "dma_device_id": "system", 00:10:22.258 "dma_device_type": 1 00:10:22.258 }, 00:10:22.258 { 00:10:22.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.258 "dma_device_type": 2 00:10:22.258 }, 
00:10:22.258 { 00:10:22.258 "dma_device_id": "system", 00:10:22.258 "dma_device_type": 1 00:10:22.258 }, 00:10:22.258 { 00:10:22.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.258 "dma_device_type": 2 00:10:22.258 }, 00:10:22.258 { 00:10:22.258 "dma_device_id": "system", 00:10:22.258 "dma_device_type": 1 00:10:22.258 }, 00:10:22.258 { 00:10:22.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.258 "dma_device_type": 2 00:10:22.258 } 00:10:22.258 ], 00:10:22.258 "driver_specific": { 00:10:22.258 "raid": { 00:10:22.258 "uuid": "1a5ecd21-495e-4728-b222-88bc49f342be", 00:10:22.258 "strip_size_kb": 64, 00:10:22.258 "state": "online", 00:10:22.258 "raid_level": "raid0", 00:10:22.258 "superblock": false, 00:10:22.258 "num_base_bdevs": 3, 00:10:22.258 "num_base_bdevs_discovered": 3, 00:10:22.258 "num_base_bdevs_operational": 3, 00:10:22.258 "base_bdevs_list": [ 00:10:22.258 { 00:10:22.258 "name": "BaseBdev1", 00:10:22.258 "uuid": "58b07642-67ed-479e-b5ca-70c3acfcf06c", 00:10:22.258 "is_configured": true, 00:10:22.258 "data_offset": 0, 00:10:22.258 "data_size": 65536 00:10:22.258 }, 00:10:22.258 { 00:10:22.258 "name": "BaseBdev2", 00:10:22.258 "uuid": "12ce79ad-3907-4a40-88d5-3faf5bddbc0f", 00:10:22.258 "is_configured": true, 00:10:22.258 "data_offset": 0, 00:10:22.258 "data_size": 65536 00:10:22.258 }, 00:10:22.258 { 00:10:22.258 "name": "BaseBdev3", 00:10:22.258 "uuid": "c215a411-b3f3-4933-bee0-ce895bb753e8", 00:10:22.258 "is_configured": true, 00:10:22.258 "data_offset": 0, 00:10:22.258 "data_size": 65536 00:10:22.258 } 00:10:22.258 ] 00:10:22.258 } 00:10:22.258 } 00:10:22.258 }' 00:10:22.258 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:22.259 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:10:22.259 BaseBdev2 00:10:22.259 BaseBdev3' 00:10:22.259 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:22.259 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:22.259 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:22.516 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:22.516 "name": "BaseBdev1", 00:10:22.516 "aliases": [ 00:10:22.516 "58b07642-67ed-479e-b5ca-70c3acfcf06c" 00:10:22.516 ], 00:10:22.516 "product_name": "Malloc disk", 00:10:22.516 "block_size": 512, 00:10:22.516 "num_blocks": 65536, 00:10:22.516 "uuid": "58b07642-67ed-479e-b5ca-70c3acfcf06c", 00:10:22.516 "assigned_rate_limits": { 00:10:22.516 "rw_ios_per_sec": 0, 00:10:22.516 "rw_mbytes_per_sec": 0, 00:10:22.516 "r_mbytes_per_sec": 0, 00:10:22.516 "w_mbytes_per_sec": 0 00:10:22.517 }, 00:10:22.517 "claimed": true, 00:10:22.517 "claim_type": "exclusive_write", 00:10:22.517 "zoned": false, 00:10:22.517 "supported_io_types": { 00:10:22.517 "read": true, 00:10:22.517 "write": true, 00:10:22.517 "unmap": true, 00:10:22.517 "write_zeroes": true, 00:10:22.517 "flush": true, 00:10:22.517 "reset": true, 00:10:22.517 "compare": false, 00:10:22.517 "compare_and_write": false, 00:10:22.517 "abort": true, 00:10:22.517 "nvme_admin": false, 00:10:22.517 "nvme_io": false 00:10:22.517 }, 00:10:22.517 "memory_domains": [ 00:10:22.517 { 00:10:22.517 "dma_device_id": "system", 
00:10:22.517 "dma_device_type": 1 00:10:22.517 }, 00:10:22.517 { 00:10:22.517 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:22.517 "dma_device_type": 2 00:10:22.517 } 00:10:22.517 ], 00:10:22.517 "driver_specific": {} 00:10:22.517 }' 00:10:22.517 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:22.517 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:22.517 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:22.517 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:22.517 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:22.517 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:22.517 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:22.774 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:22.774 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:22.774 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:22.774 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:22.774 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:22.774 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:22.774 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:22.774 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:23.032 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:23.032 "name": "BaseBdev2", 00:10:23.032 "aliases": [ 00:10:23.032 "12ce79ad-3907-4a40-88d5-3faf5bddbc0f" 00:10:23.032 ], 00:10:23.032 "product_name": "Malloc disk", 00:10:23.032 "block_size": 512, 00:10:23.032 "num_blocks": 65536, 00:10:23.032 "uuid": "12ce79ad-3907-4a40-88d5-3faf5bddbc0f", 00:10:23.032 "assigned_rate_limits": { 00:10:23.032 "rw_ios_per_sec": 0, 00:10:23.032 "rw_mbytes_per_sec": 0, 00:10:23.032 "r_mbytes_per_sec": 0, 00:10:23.032 "w_mbytes_per_sec": 0 00:10:23.032 }, 00:10:23.032 "claimed": true, 00:10:23.032 "claim_type": "exclusive_write", 00:10:23.032 "zoned": false, 00:10:23.032 "supported_io_types": { 00:10:23.032 "read": true, 00:10:23.032 "write": true, 00:10:23.032 "unmap": true, 00:10:23.032 "write_zeroes": true, 00:10:23.032 "flush": true, 00:10:23.032 "reset": true, 00:10:23.032 "compare": false, 00:10:23.032 "compare_and_write": false, 00:10:23.032 "abort": true, 00:10:23.032 "nvme_admin": false, 00:10:23.032 "nvme_io": false 00:10:23.032 }, 00:10:23.032 "memory_domains": [ 00:10:23.032 { 00:10:23.032 "dma_device_id": "system", 00:10:23.032 "dma_device_type": 1 00:10:23.032 }, 00:10:23.032 { 00:10:23.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:23.032 "dma_device_type": 2 00:10:23.032 } 00:10:23.032 ], 00:10:23.032 "driver_specific": {} 00:10:23.032 }' 00:10:23.032 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:23.032 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:23.032 04:11:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:23.032 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:23.032 04:11:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:23.032 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:23.032 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:23.290 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:23.290 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:23.290 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:23.290 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:23.290 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:23.290 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:23.290 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:10:23.290 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:23.549 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:23.549 "name": "BaseBdev3", 00:10:23.549 "aliases": [ 00:10:23.549 "c215a411-b3f3-4933-bee0-ce895bb753e8" 00:10:23.549 ], 00:10:23.549 "product_name": "Malloc disk", 00:10:23.549 "block_size": 512, 00:10:23.549 "num_blocks": 65536, 00:10:23.549 "uuid": "c215a411-b3f3-4933-bee0-ce895bb753e8", 00:10:23.549 "assigned_rate_limits": { 00:10:23.549 "rw_ios_per_sec": 0, 00:10:23.549 "rw_mbytes_per_sec": 0, 00:10:23.549 "r_mbytes_per_sec": 0, 00:10:23.549 "w_mbytes_per_sec": 0 00:10:23.549 }, 00:10:23.549 "claimed": true, 00:10:23.549 "claim_type": "exclusive_write", 00:10:23.549 "zoned": false, 00:10:23.549 "supported_io_types": { 00:10:23.549 "read": true, 00:10:23.549 "write": true, 00:10:23.549 "unmap": true, 00:10:23.549 "write_zeroes": true, 00:10:23.549 "flush": true, 00:10:23.549 "reset": true, 00:10:23.549 "compare": false, 00:10:23.549 "compare_and_write": false, 00:10:23.549 "abort": true, 00:10:23.549 "nvme_admin": false, 00:10:23.549 "nvme_io": false 00:10:23.549 }, 00:10:23.549 "memory_domains": [ 00:10:23.549 { 00:10:23.549 "dma_device_id": "system", 00:10:23.549 "dma_device_type": 1 00:10:23.549 }, 00:10:23.549 { 00:10:23.549 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:23.549 "dma_device_type": 2 00:10:23.549 } 00:10:23.549 ], 00:10:23.549 "driver_specific": {} 00:10:23.549 }' 00:10:23.549 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:23.549 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:23.549 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:23.549 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:23.549 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:23.549 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:23.549 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:10:23.549 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:23.806 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:23.806 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:23.806 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:23.806 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:23.806 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:24.063 [2024-05-15 04:11:11.864297] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:24.063 [2024-05-15 04:11:11.864329] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:24.063 [2024-05-15 04:11:11.864373] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:24.063 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:10:24.063 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:10:24.063 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:24.063 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:10:24.063 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:10:24.063 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:10:24.063 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:24.063 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:10:24.063 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:24.063 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:24.064 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:24.064 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:24.064 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:24.064 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:24.064 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:24.064 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:24.064 04:11:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:24.321 04:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:24.321 "name": "Existed_Raid", 00:10:24.321 "uuid": "1a5ecd21-495e-4728-b222-88bc49f342be", 00:10:24.321 "strip_size_kb": 64, 00:10:24.321 "state": "offline", 00:10:24.321 "raid_level": "raid0", 00:10:24.321 "superblock": false, 00:10:24.321 "num_base_bdevs": 3, 00:10:24.321 "num_base_bdevs_discovered": 2, 00:10:24.321 
"num_base_bdevs_operational": 2, 00:10:24.321 "base_bdevs_list": [ 00:10:24.321 { 00:10:24.321 "name": null, 00:10:24.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:24.321 "is_configured": false, 00:10:24.321 "data_offset": 0, 00:10:24.321 "data_size": 65536 00:10:24.321 }, 00:10:24.321 { 00:10:24.321 "name": "BaseBdev2", 00:10:24.321 "uuid": "12ce79ad-3907-4a40-88d5-3faf5bddbc0f", 00:10:24.321 "is_configured": true, 00:10:24.321 "data_offset": 0, 00:10:24.321 "data_size": 65536 00:10:24.321 }, 00:10:24.321 { 00:10:24.321 "name": "BaseBdev3", 00:10:24.321 "uuid": "c215a411-b3f3-4933-bee0-ce895bb753e8", 00:10:24.321 "is_configured": true, 00:10:24.321 "data_offset": 0, 00:10:24.321 "data_size": 65536 00:10:24.321 } 00:10:24.321 ] 00:10:24.321 }' 00:10:24.321 04:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:24.321 04:11:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:24.885 04:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:10:24.885 04:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:24.885 04:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:24.885 04:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:10:25.142 04:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:10:25.142 04:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:25.142 04:11:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:25.142 [2024-05-15 04:11:13.145455] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:25.400 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:10:25.400 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:25.400 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:25.400 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:10:25.400 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:10:25.400 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:25.400 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:10:25.657 [2024-05-15 04:11:13.622090] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:10:25.657 [2024-05-15 04:11:13.622169] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x88b7e0 name Existed_Raid, state offline 00:10:25.657 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:10:25.657 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:25.657 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:25.657 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:10:25.927 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:10:25.927 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:10:25.927 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:10:25.927 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:10:25.927 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:10:25.927 04:11:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:26.220 BaseBdev2 00:10:26.220 04:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:10:26.220 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:10:26.220 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:26.220 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:26.220 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:26.220 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:26.220 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:26.478 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:26.735 [ 00:10:26.735 { 00:10:26.735 "name": "BaseBdev2", 00:10:26.735 "aliases": [ 00:10:26.735 "917acb0e-a714-4515-b331-95850ad277f0" 00:10:26.735 ], 00:10:26.735 "product_name": "Malloc disk", 00:10:26.735 "block_size": 512, 00:10:26.735 "num_blocks": 65536, 00:10:26.735 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:26.735 "assigned_rate_limits": { 00:10:26.735 "rw_ios_per_sec": 0, 00:10:26.735 "rw_mbytes_per_sec": 0, 00:10:26.735 "r_mbytes_per_sec": 0, 00:10:26.735 "w_mbytes_per_sec": 0 00:10:26.735 }, 00:10:26.735 "claimed": false, 00:10:26.736 "zoned": false, 00:10:26.736 "supported_io_types": { 00:10:26.736 "read": true, 00:10:26.736 "write": true, 00:10:26.736 "unmap": true, 00:10:26.736 "write_zeroes": true, 00:10:26.736 "flush": true, 00:10:26.736 "reset": true, 00:10:26.736 "compare": false, 00:10:26.736 "compare_and_write": false, 00:10:26.736 "abort": true, 00:10:26.736 "nvme_admin": false, 00:10:26.736 "nvme_io": false 00:10:26.736 }, 00:10:26.736 "memory_domains": [ 00:10:26.736 { 00:10:26.736 "dma_device_id": "system", 00:10:26.736 "dma_device_type": 1 00:10:26.736 }, 00:10:26.736 { 00:10:26.736 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.736 "dma_device_type": 2 00:10:26.736 } 00:10:26.736 ], 00:10:26.736 "driver_specific": {} 00:10:26.736 } 00:10:26.736 ] 00:10:26.736 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:10:26.736 04:11:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:10:26.736 04:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:10:26.736 04:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:10:26.994 BaseBdev3 00:10:26.994 04:11:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:10:26.994 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:10:26.994 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:26.994 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:26.994 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:26.994 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:26.994 04:11:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:27.252 04:11:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:10:27.511 [ 00:10:27.511 { 00:10:27.511 "name": "BaseBdev3", 00:10:27.511 "aliases": [ 00:10:27.511 "299ac8dd-b8ba-433d-985e-cfbca17ae1a9" 00:10:27.511 ], 00:10:27.511 "product_name": "Malloc disk", 00:10:27.511 "block_size": 512, 00:10:27.511 "num_blocks": 65536, 00:10:27.511 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:27.511 "assigned_rate_limits": { 00:10:27.512 "rw_ios_per_sec": 0, 00:10:27.512 "rw_mbytes_per_sec": 0, 00:10:27.512 "r_mbytes_per_sec": 0, 00:10:27.512 "w_mbytes_per_sec": 0 00:10:27.512 }, 00:10:27.512 "claimed": false, 00:10:27.512 "zoned": false, 00:10:27.512 "supported_io_types": { 00:10:27.512 "read": true, 00:10:27.512 "write": true, 00:10:27.512 "unmap": true, 00:10:27.512 "write_zeroes": true, 00:10:27.512 "flush": true, 00:10:27.512 "reset": true, 00:10:27.512 "compare": false, 00:10:27.512 "compare_and_write": false, 00:10:27.512 "abort": true, 00:10:27.512 "nvme_admin": false, 00:10:27.512 "nvme_io": false 00:10:27.512 }, 00:10:27.512 "memory_domains": [ 00:10:27.512 { 00:10:27.512 "dma_device_id": "system", 00:10:27.512 "dma_device_type": 1 00:10:27.512 }, 00:10:27.512 { 00:10:27.512 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:27.512 "dma_device_type": 2 00:10:27.512 } 00:10:27.512 ], 00:10:27.512 "driver_specific": {} 00:10:27.512 } 00:10:27.512 ] 00:10:27.512 04:11:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:10:27.512 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:10:27.512 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:10:27.512 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:27.770 [2024-05-15 04:11:15.534746] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:27.770 
[2024-05-15 04:11:15.534791] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:27.770 [2024-05-15 04:11:15.534817] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:27.770 [2024-05-15 04:11:15.536046] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:27.770 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:28.028 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:28.028 "name": "Existed_Raid", 00:10:28.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:28.028 "strip_size_kb": 64, 00:10:28.028 "state": "configuring", 00:10:28.028 "raid_level": "raid0", 00:10:28.028 "superblock": false, 00:10:28.028 "num_base_bdevs": 3, 00:10:28.028 "num_base_bdevs_discovered": 2, 00:10:28.028 "num_base_bdevs_operational": 3, 00:10:28.028 "base_bdevs_list": [ 00:10:28.028 { 00:10:28.028 "name": "BaseBdev1", 00:10:28.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:28.028 "is_configured": false, 00:10:28.028 "data_offset": 0, 00:10:28.028 "data_size": 0 00:10:28.028 }, 00:10:28.028 { 00:10:28.028 "name": "BaseBdev2", 00:10:28.028 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:28.028 "is_configured": true, 00:10:28.028 "data_offset": 0, 00:10:28.028 "data_size": 65536 00:10:28.028 }, 00:10:28.028 { 00:10:28.028 "name": "BaseBdev3", 00:10:28.028 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:28.028 "is_configured": true, 00:10:28.028 "data_offset": 0, 00:10:28.028 "data_size": 65536 00:10:28.028 } 00:10:28.028 ] 00:10:28.028 }' 00:10:28.028 04:11:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:28.028 04:11:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:10:28.593 [2024-05-15 04:11:16.553463] 
bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.593 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:28.851 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:28.851 "name": "Existed_Raid", 00:10:28.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:28.851 "strip_size_kb": 64, 00:10:28.851 "state": "configuring", 00:10:28.851 "raid_level": "raid0", 00:10:28.851 "superblock": false, 00:10:28.851 "num_base_bdevs": 3, 00:10:28.851 "num_base_bdevs_discovered": 1, 00:10:28.851 "num_base_bdevs_operational": 3, 00:10:28.851 "base_bdevs_list": [ 00:10:28.851 { 00:10:28.851 "name": "BaseBdev1", 00:10:28.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:28.851 "is_configured": false, 00:10:28.851 "data_offset": 0, 00:10:28.851 "data_size": 0 00:10:28.851 }, 00:10:28.851 { 00:10:28.851 "name": null, 00:10:28.851 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:28.851 "is_configured": false, 00:10:28.851 "data_offset": 0, 00:10:28.851 "data_size": 65536 00:10:28.851 }, 00:10:28.851 { 00:10:28.851 "name": "BaseBdev3", 00:10:28.851 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:28.851 "is_configured": true, 00:10:28.851 "data_offset": 0, 00:10:28.851 "data_size": 65536 00:10:28.851 } 00:10:28.851 ] 00:10:28.851 }' 00:10:28.851 04:11:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:28.851 04:11:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:29.418 04:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:29.418 04:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:10:29.676 04:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:10:29.676 04:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:29.935 [2024-05-15 04:11:17.790391] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:29.935 BaseBdev1 00:10:29.935 04:11:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:10:29.935 04:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:10:29.935 04:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:29.935 04:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:29.935 04:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:29.935 04:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:29.935 04:11:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:30.193 04:11:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:30.451 [ 00:10:30.452 { 00:10:30.452 "name": "BaseBdev1", 00:10:30.452 "aliases": [ 00:10:30.452 "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d" 00:10:30.452 ], 00:10:30.452 "product_name": "Malloc disk", 00:10:30.452 "block_size": 512, 00:10:30.452 "num_blocks": 65536, 00:10:30.452 "uuid": "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d", 00:10:30.452 "assigned_rate_limits": { 00:10:30.452 "rw_ios_per_sec": 0, 00:10:30.452 "rw_mbytes_per_sec": 0, 00:10:30.452 "r_mbytes_per_sec": 0, 00:10:30.452 "w_mbytes_per_sec": 0 00:10:30.452 }, 00:10:30.452 "claimed": true, 00:10:30.452 "claim_type": "exclusive_write", 00:10:30.452 "zoned": false, 00:10:30.452 "supported_io_types": { 00:10:30.452 "read": true, 00:10:30.452 "write": true, 00:10:30.452 "unmap": true, 00:10:30.452 "write_zeroes": true, 00:10:30.452 "flush": true, 00:10:30.452 "reset": true, 00:10:30.452 "compare": false, 00:10:30.452 "compare_and_write": false, 00:10:30.452 "abort": true, 00:10:30.452 "nvme_admin": false, 00:10:30.452 "nvme_io": false 00:10:30.452 }, 00:10:30.452 "memory_domains": [ 00:10:30.452 { 00:10:30.452 "dma_device_id": "system", 00:10:30.452 "dma_device_type": 1 00:10:30.452 }, 00:10:30.452 { 00:10:30.452 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:30.452 "dma_device_type": 2 00:10:30.452 } 00:10:30.452 ], 00:10:30.452 "driver_specific": {} 00:10:30.452 } 00:10:30.452 ] 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:30.452 04:11:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:30.452 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:30.710 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:30.710 "name": "Existed_Raid", 00:10:30.710 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.710 "strip_size_kb": 64, 00:10:30.710 "state": "configuring", 00:10:30.710 "raid_level": "raid0", 00:10:30.710 "superblock": false, 00:10:30.710 "num_base_bdevs": 3, 00:10:30.710 "num_base_bdevs_discovered": 2, 00:10:30.710 "num_base_bdevs_operational": 3, 00:10:30.710 "base_bdevs_list": [ 00:10:30.710 { 00:10:30.710 "name": "BaseBdev1", 00:10:30.710 "uuid": "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d", 00:10:30.710 "is_configured": true, 00:10:30.710 "data_offset": 0, 00:10:30.710 "data_size": 65536 00:10:30.710 }, 00:10:30.710 { 00:10:30.710 "name": null, 00:10:30.710 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:30.710 "is_configured": false, 00:10:30.710 "data_offset": 0, 00:10:30.710 "data_size": 65536 00:10:30.710 }, 00:10:30.710 { 00:10:30.710 "name": "BaseBdev3", 00:10:30.710 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:30.710 "is_configured": true, 00:10:30.710 "data_offset": 0, 00:10:30.710 "data_size": 65536 00:10:30.710 } 00:10:30.710 ] 00:10:30.710 }' 00:10:30.710 04:11:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:30.710 04:11:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:31.275 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:31.275 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:10:31.533 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:10:31.533 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:10:31.791 [2024-05-15 04:11:19.587320] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:10:31.791 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:31.791 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:31.791 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:31.791 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:31.791 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:31.791 04:11:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:31.791 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:31.791 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:31.792 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:31.792 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:31.792 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:31.792 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:32.050 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:32.050 "name": "Existed_Raid", 00:10:32.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:32.050 "strip_size_kb": 64, 00:10:32.050 "state": "configuring", 00:10:32.050 "raid_level": "raid0", 00:10:32.050 "superblock": false, 00:10:32.050 "num_base_bdevs": 3, 00:10:32.050 "num_base_bdevs_discovered": 1, 00:10:32.050 "num_base_bdevs_operational": 3, 00:10:32.050 "base_bdevs_list": [ 00:10:32.050 { 00:10:32.050 "name": "BaseBdev1", 00:10:32.050 "uuid": "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d", 00:10:32.050 "is_configured": true, 00:10:32.050 "data_offset": 0, 00:10:32.050 "data_size": 65536 00:10:32.050 }, 00:10:32.050 { 00:10:32.050 "name": null, 00:10:32.050 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:32.050 "is_configured": false, 00:10:32.050 "data_offset": 0, 00:10:32.050 "data_size": 65536 00:10:32.050 }, 00:10:32.050 { 00:10:32.050 "name": null, 00:10:32.050 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:32.050 "is_configured": false, 00:10:32.050 "data_offset": 0, 00:10:32.050 "data_size": 65536 00:10:32.050 } 00:10:32.050 ] 00:10:32.050 }' 00:10:32.050 04:11:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:32.050 04:11:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:32.616 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:32.616 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:10:32.874 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:10:32.874 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:10:33.132 [2024-05-15 04:11:20.922913] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 
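[annotation] The verify_raid_bdev_state helper repeats the same pattern throughout this trace: it records the expected name, state, RAID level, strip size and operational base-bdev count, pulls the full raid list over the test socket, and isolates the array under test with jq. Below is a condensed sketch of that query step using only the rpc.py subcommand, socket path and jq filter that actually appear in this run; the final assertion line is illustrative and is not the test script's literal code.

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Fetch every raid bdev known to the target and keep only the one named Existed_Raid
  info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
  # Illustrative check: after a base bdev is removed the array should be back in "configuring"
  [ "$(echo "$info" | jq -r .state)" = "configuring" ] || echo "unexpected raid state"
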
00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.132 04:11:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:33.390 04:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:33.390 "name": "Existed_Raid", 00:10:33.390 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:33.390 "strip_size_kb": 64, 00:10:33.390 "state": "configuring", 00:10:33.390 "raid_level": "raid0", 00:10:33.390 "superblock": false, 00:10:33.390 "num_base_bdevs": 3, 00:10:33.390 "num_base_bdevs_discovered": 2, 00:10:33.390 "num_base_bdevs_operational": 3, 00:10:33.390 "base_bdevs_list": [ 00:10:33.390 { 00:10:33.390 "name": "BaseBdev1", 00:10:33.390 "uuid": "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d", 00:10:33.390 "is_configured": true, 00:10:33.390 "data_offset": 0, 00:10:33.390 "data_size": 65536 00:10:33.390 }, 00:10:33.390 { 00:10:33.390 "name": null, 00:10:33.390 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:33.390 "is_configured": false, 00:10:33.390 "data_offset": 0, 00:10:33.390 "data_size": 65536 00:10:33.390 }, 00:10:33.390 { 00:10:33.390 "name": "BaseBdev3", 00:10:33.390 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:33.390 "is_configured": true, 00:10:33.390 "data_offset": 0, 00:10:33.390 "data_size": 65536 00:10:33.390 } 00:10:33.390 ] 00:10:33.390 }' 00:10:33.390 04:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:33.390 04:11:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:33.956 04:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.956 04:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:10:33.956 04:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:10:33.956 04:11:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:34.214 [2024-05-15 04:11:22.178224] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:34.214 
04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:34.214 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:34.471 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:34.471 "name": "Existed_Raid", 00:10:34.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:34.471 "strip_size_kb": 64, 00:10:34.471 "state": "configuring", 00:10:34.471 "raid_level": "raid0", 00:10:34.471 "superblock": false, 00:10:34.471 "num_base_bdevs": 3, 00:10:34.471 "num_base_bdevs_discovered": 1, 00:10:34.471 "num_base_bdevs_operational": 3, 00:10:34.471 "base_bdevs_list": [ 00:10:34.471 { 00:10:34.471 "name": null, 00:10:34.471 "uuid": "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d", 00:10:34.471 "is_configured": false, 00:10:34.471 "data_offset": 0, 00:10:34.471 "data_size": 65536 00:10:34.471 }, 00:10:34.471 { 00:10:34.471 "name": null, 00:10:34.471 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:34.471 "is_configured": false, 00:10:34.471 "data_offset": 0, 00:10:34.471 "data_size": 65536 00:10:34.471 }, 00:10:34.471 { 00:10:34.471 "name": "BaseBdev3", 00:10:34.471 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:34.471 "is_configured": true, 00:10:34.471 "data_offset": 0, 00:10:34.471 "data_size": 65536 00:10:34.471 } 00:10:34.471 ] 00:10:34.471 }' 00:10:34.471 04:11:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:34.471 04:11:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:35.037 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.037 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:10:35.295 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:10:35.295 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:10:35.553 [2024-05-15 04:11:23.481817] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 
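[annotation] Around this point the test exercises hot removal and re-addition of a base device: a base bdev is detached, its slot in base_bdevs_list drops to an unconfigured entry while the array sits in "configuring", and bdev_raid_add_base_bdev re-claims it. A minimal sketch of that round trip is shown below, restricted to the RPC calls and jq filters visible in this trace; the bdev and raid names are the ones used in this run, and the sketch is a summary of the flow rather than the script's exact commands.

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # Detach one base bdev; the raid transitions from online back to configuring
  "$rpc" -s "$sock" bdev_raid_remove_base_bdev BaseBdev2
  # The vacated slot now reports is_configured == false in base_bdevs_list
  "$rpc" -s "$sock" bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'
  # Re-attach it; the debug log above reports "bdev BaseBdev2 is claimed"
  "$rpc" -s "$sock" bdev_raid_add_base_bdev Existed_Raid BaseBdev2
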
00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.553 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:35.811 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:35.811 "name": "Existed_Raid", 00:10:35.811 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:35.811 "strip_size_kb": 64, 00:10:35.811 "state": "configuring", 00:10:35.811 "raid_level": "raid0", 00:10:35.811 "superblock": false, 00:10:35.811 "num_base_bdevs": 3, 00:10:35.811 "num_base_bdevs_discovered": 2, 00:10:35.811 "num_base_bdevs_operational": 3, 00:10:35.811 "base_bdevs_list": [ 00:10:35.811 { 00:10:35.811 "name": null, 00:10:35.811 "uuid": "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d", 00:10:35.811 "is_configured": false, 00:10:35.811 "data_offset": 0, 00:10:35.811 "data_size": 65536 00:10:35.811 }, 00:10:35.811 { 00:10:35.811 "name": "BaseBdev2", 00:10:35.811 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:35.811 "is_configured": true, 00:10:35.811 "data_offset": 0, 00:10:35.811 "data_size": 65536 00:10:35.811 }, 00:10:35.811 { 00:10:35.811 "name": "BaseBdev3", 00:10:35.811 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:35.811 "is_configured": true, 00:10:35.811 "data_offset": 0, 00:10:35.811 "data_size": 65536 00:10:35.811 } 00:10:35.811 ] 00:10:35.811 }' 00:10:35.811 04:11:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:35.811 04:11:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:36.377 04:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:36.377 04:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:10:36.635 04:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:10:36.635 04:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:36.635 04:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:10:36.893 04:11:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ef3a257f-bc67-4a12-8f66-5e4c03eeae9d 00:10:37.152 [2024-05-15 04:11:25.022865] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:10:37.152 [2024-05-15 04:11:25.022913] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xa2f1f0 00:10:37.152 [2024-05-15 04:11:25.022922] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:10:37.152 [2024-05-15 04:11:25.023093] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x88aff0 00:10:37.152 [2024-05-15 04:11:25.023233] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa2f1f0 00:10:37.152 [2024-05-15 04:11:25.023247] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa2f1f0 00:10:37.152 [2024-05-15 04:11:25.023453] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:37.152 NewBaseBdev 00:10:37.152 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:10:37.152 04:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:10:37.152 04:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:37.152 04:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:10:37.152 04:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:37.152 04:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:37.152 04:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:37.421 04:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:10:37.683 [ 00:10:37.683 { 00:10:37.683 "name": "NewBaseBdev", 00:10:37.683 "aliases": [ 00:10:37.683 "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d" 00:10:37.683 ], 00:10:37.683 "product_name": "Malloc disk", 00:10:37.683 "block_size": 512, 00:10:37.684 "num_blocks": 65536, 00:10:37.684 "uuid": "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d", 00:10:37.684 "assigned_rate_limits": { 00:10:37.684 "rw_ios_per_sec": 0, 00:10:37.684 "rw_mbytes_per_sec": 0, 00:10:37.684 "r_mbytes_per_sec": 0, 00:10:37.684 "w_mbytes_per_sec": 0 00:10:37.684 }, 00:10:37.684 "claimed": true, 00:10:37.684 "claim_type": "exclusive_write", 00:10:37.684 "zoned": false, 00:10:37.684 "supported_io_types": { 00:10:37.684 "read": true, 00:10:37.684 "write": true, 00:10:37.684 "unmap": true, 00:10:37.684 "write_zeroes": true, 00:10:37.684 "flush": true, 00:10:37.684 "reset": true, 00:10:37.684 "compare": false, 00:10:37.684 "compare_and_write": false, 00:10:37.684 "abort": true, 00:10:37.684 "nvme_admin": false, 00:10:37.684 "nvme_io": false 00:10:37.684 }, 00:10:37.684 "memory_domains": [ 00:10:37.684 { 00:10:37.684 "dma_device_id": "system", 00:10:37.684 "dma_device_type": 1 00:10:37.684 }, 00:10:37.684 { 00:10:37.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:37.684 "dma_device_type": 2 00:10:37.684 } 00:10:37.684 ], 00:10:37.684 "driver_specific": {} 00:10:37.684 } 00:10:37.684 ] 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@903 -- # return 0 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:37.684 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:37.942 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:37.942 "name": "Existed_Raid", 00:10:37.942 "uuid": "029f354e-bfd9-4917-a2ce-35d73f65330b", 00:10:37.942 "strip_size_kb": 64, 00:10:37.942 "state": "online", 00:10:37.942 "raid_level": "raid0", 00:10:37.942 "superblock": false, 00:10:37.942 "num_base_bdevs": 3, 00:10:37.942 "num_base_bdevs_discovered": 3, 00:10:37.942 "num_base_bdevs_operational": 3, 00:10:37.942 "base_bdevs_list": [ 00:10:37.942 { 00:10:37.942 "name": "NewBaseBdev", 00:10:37.942 "uuid": "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d", 00:10:37.942 "is_configured": true, 00:10:37.942 "data_offset": 0, 00:10:37.942 "data_size": 65536 00:10:37.942 }, 00:10:37.942 { 00:10:37.943 "name": "BaseBdev2", 00:10:37.943 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:37.943 "is_configured": true, 00:10:37.943 "data_offset": 0, 00:10:37.943 "data_size": 65536 00:10:37.943 }, 00:10:37.943 { 00:10:37.943 "name": "BaseBdev3", 00:10:37.943 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:37.943 "is_configured": true, 00:10:37.943 "data_offset": 0, 00:10:37.943 "data_size": 65536 00:10:37.943 } 00:10:37.943 ] 00:10:37.943 }' 00:10:37.943 04:11:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:37.943 04:11:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:38.506 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:10:38.506 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:10:38.506 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:38.506 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:38.506 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:38.506 04:11:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:10:38.506 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:38.506 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:38.805 [2024-05-15 04:11:26.571256] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:38.805 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:38.805 "name": "Existed_Raid", 00:10:38.805 "aliases": [ 00:10:38.805 "029f354e-bfd9-4917-a2ce-35d73f65330b" 00:10:38.805 ], 00:10:38.805 "product_name": "Raid Volume", 00:10:38.805 "block_size": 512, 00:10:38.805 "num_blocks": 196608, 00:10:38.805 "uuid": "029f354e-bfd9-4917-a2ce-35d73f65330b", 00:10:38.805 "assigned_rate_limits": { 00:10:38.805 "rw_ios_per_sec": 0, 00:10:38.805 "rw_mbytes_per_sec": 0, 00:10:38.805 "r_mbytes_per_sec": 0, 00:10:38.805 "w_mbytes_per_sec": 0 00:10:38.805 }, 00:10:38.805 "claimed": false, 00:10:38.805 "zoned": false, 00:10:38.805 "supported_io_types": { 00:10:38.805 "read": true, 00:10:38.805 "write": true, 00:10:38.805 "unmap": true, 00:10:38.805 "write_zeroes": true, 00:10:38.805 "flush": true, 00:10:38.805 "reset": true, 00:10:38.805 "compare": false, 00:10:38.805 "compare_and_write": false, 00:10:38.805 "abort": false, 00:10:38.805 "nvme_admin": false, 00:10:38.805 "nvme_io": false 00:10:38.805 }, 00:10:38.805 "memory_domains": [ 00:10:38.805 { 00:10:38.805 "dma_device_id": "system", 00:10:38.806 "dma_device_type": 1 00:10:38.806 }, 00:10:38.806 { 00:10:38.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:38.806 "dma_device_type": 2 00:10:38.806 }, 00:10:38.806 { 00:10:38.806 "dma_device_id": "system", 00:10:38.806 "dma_device_type": 1 00:10:38.806 }, 00:10:38.806 { 00:10:38.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:38.806 "dma_device_type": 2 00:10:38.806 }, 00:10:38.806 { 00:10:38.806 "dma_device_id": "system", 00:10:38.806 "dma_device_type": 1 00:10:38.806 }, 00:10:38.806 { 00:10:38.806 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:38.806 "dma_device_type": 2 00:10:38.806 } 00:10:38.806 ], 00:10:38.806 "driver_specific": { 00:10:38.806 "raid": { 00:10:38.806 "uuid": "029f354e-bfd9-4917-a2ce-35d73f65330b", 00:10:38.806 "strip_size_kb": 64, 00:10:38.806 "state": "online", 00:10:38.806 "raid_level": "raid0", 00:10:38.806 "superblock": false, 00:10:38.806 "num_base_bdevs": 3, 00:10:38.806 "num_base_bdevs_discovered": 3, 00:10:38.806 "num_base_bdevs_operational": 3, 00:10:38.806 "base_bdevs_list": [ 00:10:38.806 { 00:10:38.806 "name": "NewBaseBdev", 00:10:38.806 "uuid": "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d", 00:10:38.806 "is_configured": true, 00:10:38.806 "data_offset": 0, 00:10:38.806 "data_size": 65536 00:10:38.806 }, 00:10:38.806 { 00:10:38.806 "name": "BaseBdev2", 00:10:38.806 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:38.806 "is_configured": true, 00:10:38.806 "data_offset": 0, 00:10:38.806 "data_size": 65536 00:10:38.806 }, 00:10:38.806 { 00:10:38.806 "name": "BaseBdev3", 00:10:38.806 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:38.806 "is_configured": true, 00:10:38.806 "data_offset": 0, 00:10:38.806 "data_size": 65536 00:10:38.806 } 00:10:38.806 ] 00:10:38.806 } 00:10:38.806 } 00:10:38.806 }' 00:10:38.806 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:38.806 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:10:38.806 BaseBdev2 00:10:38.806 BaseBdev3' 00:10:38.806 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:38.806 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:10:38.806 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:39.063 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:39.063 "name": "NewBaseBdev", 00:10:39.063 "aliases": [ 00:10:39.063 "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d" 00:10:39.063 ], 00:10:39.063 "product_name": "Malloc disk", 00:10:39.063 "block_size": 512, 00:10:39.063 "num_blocks": 65536, 00:10:39.063 "uuid": "ef3a257f-bc67-4a12-8f66-5e4c03eeae9d", 00:10:39.064 "assigned_rate_limits": { 00:10:39.064 "rw_ios_per_sec": 0, 00:10:39.064 "rw_mbytes_per_sec": 0, 00:10:39.064 "r_mbytes_per_sec": 0, 00:10:39.064 "w_mbytes_per_sec": 0 00:10:39.064 }, 00:10:39.064 "claimed": true, 00:10:39.064 "claim_type": "exclusive_write", 00:10:39.064 "zoned": false, 00:10:39.064 "supported_io_types": { 00:10:39.064 "read": true, 00:10:39.064 "write": true, 00:10:39.064 "unmap": true, 00:10:39.064 "write_zeroes": true, 00:10:39.064 "flush": true, 00:10:39.064 "reset": true, 00:10:39.064 "compare": false, 00:10:39.064 "compare_and_write": false, 00:10:39.064 "abort": true, 00:10:39.064 "nvme_admin": false, 00:10:39.064 "nvme_io": false 00:10:39.064 }, 00:10:39.064 "memory_domains": [ 00:10:39.064 { 00:10:39.064 "dma_device_id": "system", 00:10:39.064 "dma_device_type": 1 00:10:39.064 }, 00:10:39.064 { 00:10:39.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:39.064 "dma_device_type": 2 00:10:39.064 } 00:10:39.064 ], 00:10:39.064 "driver_specific": {} 00:10:39.064 }' 00:10:39.064 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:39.064 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:39.064 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:39.064 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:39.064 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:39.064 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:39.064 04:11:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:39.064 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:39.064 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:39.064 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:39.321 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:39.321 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:39.321 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:39.321 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:39.321 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:39.579 "name": "BaseBdev2", 00:10:39.579 "aliases": [ 00:10:39.579 "917acb0e-a714-4515-b331-95850ad277f0" 00:10:39.579 ], 00:10:39.579 "product_name": "Malloc disk", 00:10:39.579 "block_size": 512, 00:10:39.579 "num_blocks": 65536, 00:10:39.579 "uuid": "917acb0e-a714-4515-b331-95850ad277f0", 00:10:39.579 "assigned_rate_limits": { 00:10:39.579 "rw_ios_per_sec": 0, 00:10:39.579 "rw_mbytes_per_sec": 0, 00:10:39.579 "r_mbytes_per_sec": 0, 00:10:39.579 "w_mbytes_per_sec": 0 00:10:39.579 }, 00:10:39.579 "claimed": true, 00:10:39.579 "claim_type": "exclusive_write", 00:10:39.579 "zoned": false, 00:10:39.579 "supported_io_types": { 00:10:39.579 "read": true, 00:10:39.579 "write": true, 00:10:39.579 "unmap": true, 00:10:39.579 "write_zeroes": true, 00:10:39.579 "flush": true, 00:10:39.579 "reset": true, 00:10:39.579 "compare": false, 00:10:39.579 "compare_and_write": false, 00:10:39.579 "abort": true, 00:10:39.579 "nvme_admin": false, 00:10:39.579 "nvme_io": false 00:10:39.579 }, 00:10:39.579 "memory_domains": [ 00:10:39.579 { 00:10:39.579 "dma_device_id": "system", 00:10:39.579 "dma_device_type": 1 00:10:39.579 }, 00:10:39.579 { 00:10:39.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:39.579 "dma_device_type": 2 00:10:39.579 } 00:10:39.579 ], 00:10:39.579 "driver_specific": {} 00:10:39.579 }' 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:39.579 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:39.837 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:39.838 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:39.838 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:39.838 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:10:39.838 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:40.096 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:40.096 "name": "BaseBdev3", 00:10:40.096 "aliases": [ 00:10:40.096 "299ac8dd-b8ba-433d-985e-cfbca17ae1a9" 00:10:40.096 ], 00:10:40.096 "product_name": 
"Malloc disk", 00:10:40.096 "block_size": 512, 00:10:40.096 "num_blocks": 65536, 00:10:40.096 "uuid": "299ac8dd-b8ba-433d-985e-cfbca17ae1a9", 00:10:40.096 "assigned_rate_limits": { 00:10:40.096 "rw_ios_per_sec": 0, 00:10:40.096 "rw_mbytes_per_sec": 0, 00:10:40.096 "r_mbytes_per_sec": 0, 00:10:40.096 "w_mbytes_per_sec": 0 00:10:40.096 }, 00:10:40.096 "claimed": true, 00:10:40.096 "claim_type": "exclusive_write", 00:10:40.096 "zoned": false, 00:10:40.096 "supported_io_types": { 00:10:40.096 "read": true, 00:10:40.096 "write": true, 00:10:40.096 "unmap": true, 00:10:40.096 "write_zeroes": true, 00:10:40.096 "flush": true, 00:10:40.096 "reset": true, 00:10:40.096 "compare": false, 00:10:40.096 "compare_and_write": false, 00:10:40.096 "abort": true, 00:10:40.096 "nvme_admin": false, 00:10:40.096 "nvme_io": false 00:10:40.096 }, 00:10:40.096 "memory_domains": [ 00:10:40.096 { 00:10:40.096 "dma_device_id": "system", 00:10:40.096 "dma_device_type": 1 00:10:40.096 }, 00:10:40.096 { 00:10:40.096 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:40.096 "dma_device_type": 2 00:10:40.096 } 00:10:40.096 ], 00:10:40.096 "driver_specific": {} 00:10:40.096 }' 00:10:40.096 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:40.096 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:40.096 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:40.096 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:40.096 04:11:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:40.096 04:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:40.096 04:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:40.096 04:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:40.096 04:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:40.096 04:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:40.355 04:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:40.355 04:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:40.355 04:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:40.613 [2024-05-15 04:11:28.375904] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:40.613 [2024-05-15 04:11:28.375935] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:40.613 [2024-05-15 04:11:28.376015] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:40.613 [2024-05-15 04:11:28.376085] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:40.613 [2024-05-15 04:11:28.376098] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa2f1f0 name Existed_Raid, state offline 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 3838310 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 3838310 ']' 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@950 -- # kill -0 3838310 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3838310 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3838310' 00:10:40.613 killing process with pid 3838310 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 3838310 00:10:40.613 [2024-05-15 04:11:28.414600] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:40.613 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 3838310 00:10:40.613 [2024-05-15 04:11:28.445792] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:10:40.872 00:10:40.872 real 0m27.300s 00:10:40.872 user 0m50.880s 00:10:40.872 sys 0m3.790s 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:40.872 ************************************ 00:10:40.872 END TEST raid_state_function_test 00:10:40.872 ************************************ 00:10:40.872 04:11:28 bdev_raid -- bdev/bdev_raid.sh@804 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:10:40.872 04:11:28 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:10:40.872 04:11:28 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:10:40.872 04:11:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:40.872 ************************************ 00:10:40.872 START TEST raid_state_function_test_sb 00:10:40.872 ************************************ 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 3 true 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 
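The raid0 state-function test above ends by deleting Existed_Raid, killing the bdev_svc process it started, and printing its timing; the superblock variant that follows repeats the same state machine with -s passed to bdev_raid_create. The state check it keeps re-running (the bdev_raid.sh@127 entries) amounts to one RPC plus a jq filter. A minimal stand-alone sketch of that check, using only commands that appear in this log (the workspace path and socket name are the ones from this run, not fixed SPDK locations):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Fetch the named raid bdev, as the bdev_raid.sh@127 step does.
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid")')

    # Compare the fields the test asserts on: state, level, and base bdev counts.
    state=$(echo "$info" | jq -r .state)
    level=$(echo "$info" | jq -r .raid_level)
    found=$(echo "$info" | jq -r .num_base_bdevs_discovered)
    wanted=$(echo "$info" | jq -r .num_base_bdevs_operational)
    echo "Existed_Raid: state=$state raid_level=$level base_bdevs=$found/$wanted"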
00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=3842228 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3842228' 00:10:40.872 Process raid pid: 3842228 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 3842228 /var/tmp/spdk-raid.sock 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3842228 ']' 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:40.872 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:10:40.872 04:11:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:40.872 [2024-05-15 04:11:28.798208] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
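Each state-function test starts its own SPDK application to serve as the RPC target: bdev_svc is launched with -r pointing at /var/tmp/spdk-raid.sock, -i 0 as the shared-memory instance id, and -L bdev_raid to enable the raid debug log flag, and the harness waits for that socket to answer RPCs before creating any bdevs. A rough equivalent of that start-up, sketched with the paths from this workspace (the polling loop only approximates the harness's waitforlisten helper and assumes the standard rpc_get_methods RPC is available):

    svc=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Start the bdev service with raid debug logging and remember its pid.
    "$svc" -r "$sock" -i 0 -L bdev_raid &
    raid_pid=$!

    # Wait until the RPC server on the socket responds before driving the test.
    until "$rpc" -s "$sock" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done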
00:10:40.873 [2024-05-15 04:11:28.798279] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:40.873 [2024-05-15 04:11:28.882097] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:41.131 [2024-05-15 04:11:28.999480] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:10:41.131 [2024-05-15 04:11:29.065850] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:41.131 [2024-05-15 04:11:29.065892] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:41.131 04:11:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:10:41.131 04:11:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:10:41.131 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:41.390 [2024-05-15 04:11:29.352899] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:41.390 [2024-05-15 04:11:29.352939] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:41.390 [2024-05-15 04:11:29.352950] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:41.390 [2024-05-15 04:11:29.352961] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:41.390 [2024-05-15 04:11:29.352969] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:10:41.390 [2024-05-15 04:11:29.352979] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:41.390 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:41.649 04:11:29 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:41.649 "name": "Existed_Raid", 00:10:41.649 "uuid": "b5c009eb-091a-407c-b851-38957567ae02", 00:10:41.649 "strip_size_kb": 64, 00:10:41.649 "state": "configuring", 00:10:41.649 "raid_level": "raid0", 00:10:41.649 "superblock": true, 00:10:41.649 "num_base_bdevs": 3, 00:10:41.649 "num_base_bdevs_discovered": 0, 00:10:41.649 "num_base_bdevs_operational": 3, 00:10:41.649 "base_bdevs_list": [ 00:10:41.649 { 00:10:41.649 "name": "BaseBdev1", 00:10:41.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:41.649 "is_configured": false, 00:10:41.649 "data_offset": 0, 00:10:41.649 "data_size": 0 00:10:41.649 }, 00:10:41.649 { 00:10:41.649 "name": "BaseBdev2", 00:10:41.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:41.649 "is_configured": false, 00:10:41.649 "data_offset": 0, 00:10:41.649 "data_size": 0 00:10:41.649 }, 00:10:41.649 { 00:10:41.649 "name": "BaseBdev3", 00:10:41.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:41.649 "is_configured": false, 00:10:41.649 "data_offset": 0, 00:10:41.649 "data_size": 0 00:10:41.649 } 00:10:41.649 ] 00:10:41.649 }' 00:10:41.649 04:11:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:41.649 04:11:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:42.216 04:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:42.474 [2024-05-15 04:11:30.439689] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:42.474 [2024-05-15 04:11:30.439715] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ea5020 name Existed_Raid, state configuring 00:10:42.474 04:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:42.732 [2024-05-15 04:11:30.684373] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:42.732 [2024-05-15 04:11:30.684413] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:42.732 [2024-05-15 04:11:30.684423] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:42.732 [2024-05-15 04:11:30.684433] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:42.732 [2024-05-15 04:11:30.684441] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:10:42.732 [2024-05-15 04:11:30.684450] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:10:42.732 04:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:42.990 [2024-05-15 04:11:30.941294] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:42.990 BaseBdev1 00:10:42.990 04:11:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:10:42.990 04:11:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:10:42.990 04:11:30 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:42.990 04:11:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:42.990 04:11:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:42.990 04:11:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:42.990 04:11:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:43.291 04:11:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:43.561 [ 00:10:43.561 { 00:10:43.561 "name": "BaseBdev1", 00:10:43.561 "aliases": [ 00:10:43.561 "5dc670a5-849a-4f60-b6d9-6e9788f5a287" 00:10:43.561 ], 00:10:43.561 "product_name": "Malloc disk", 00:10:43.561 "block_size": 512, 00:10:43.561 "num_blocks": 65536, 00:10:43.561 "uuid": "5dc670a5-849a-4f60-b6d9-6e9788f5a287", 00:10:43.561 "assigned_rate_limits": { 00:10:43.561 "rw_ios_per_sec": 0, 00:10:43.561 "rw_mbytes_per_sec": 0, 00:10:43.561 "r_mbytes_per_sec": 0, 00:10:43.561 "w_mbytes_per_sec": 0 00:10:43.561 }, 00:10:43.561 "claimed": true, 00:10:43.561 "claim_type": "exclusive_write", 00:10:43.561 "zoned": false, 00:10:43.561 "supported_io_types": { 00:10:43.561 "read": true, 00:10:43.561 "write": true, 00:10:43.561 "unmap": true, 00:10:43.561 "write_zeroes": true, 00:10:43.561 "flush": true, 00:10:43.561 "reset": true, 00:10:43.561 "compare": false, 00:10:43.561 "compare_and_write": false, 00:10:43.561 "abort": true, 00:10:43.561 "nvme_admin": false, 00:10:43.561 "nvme_io": false 00:10:43.561 }, 00:10:43.561 "memory_domains": [ 00:10:43.561 { 00:10:43.561 "dma_device_id": "system", 00:10:43.561 "dma_device_type": 1 00:10:43.561 }, 00:10:43.561 { 00:10:43.561 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:43.561 "dma_device_type": 2 00:10:43.561 } 00:10:43.561 ], 00:10:43.561 "driver_specific": {} 00:10:43.561 } 00:10:43.561 ] 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:43.561 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:43.818 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:43.818 "name": "Existed_Raid", 00:10:43.818 "uuid": "aefe7f03-d0fc-4462-9952-7f49bca9fe54", 00:10:43.818 "strip_size_kb": 64, 00:10:43.818 "state": "configuring", 00:10:43.818 "raid_level": "raid0", 00:10:43.818 "superblock": true, 00:10:43.818 "num_base_bdevs": 3, 00:10:43.818 "num_base_bdevs_discovered": 1, 00:10:43.818 "num_base_bdevs_operational": 3, 00:10:43.818 "base_bdevs_list": [ 00:10:43.818 { 00:10:43.818 "name": "BaseBdev1", 00:10:43.818 "uuid": "5dc670a5-849a-4f60-b6d9-6e9788f5a287", 00:10:43.818 "is_configured": true, 00:10:43.818 "data_offset": 2048, 00:10:43.818 "data_size": 63488 00:10:43.818 }, 00:10:43.818 { 00:10:43.818 "name": "BaseBdev2", 00:10:43.818 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:43.818 "is_configured": false, 00:10:43.818 "data_offset": 0, 00:10:43.818 "data_size": 0 00:10:43.818 }, 00:10:43.818 { 00:10:43.818 "name": "BaseBdev3", 00:10:43.818 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:43.818 "is_configured": false, 00:10:43.818 "data_offset": 0, 00:10:43.818 "data_size": 0 00:10:43.818 } 00:10:43.818 ] 00:10:43.818 }' 00:10:43.818 04:11:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:43.818 04:11:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:44.381 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:44.638 [2024-05-15 04:11:32.529479] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:44.638 [2024-05-15 04:11:32.529528] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ea48f0 name Existed_Raid, state configuring 00:10:44.638 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:44.895 [2024-05-15 04:11:32.770161] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:44.895 [2024-05-15 04:11:32.771655] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:44.895 [2024-05-15 04:11:32.771688] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:44.895 [2024-05-15 04:11:32.771709] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:10:44.895 [2024-05-15 04:11:32.771722] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=Existed_Raid 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:44.895 04:11:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:45.153 04:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:45.153 "name": "Existed_Raid", 00:10:45.153 "uuid": "3dfc6bd4-c64f-472f-a54a-c8deb9201886", 00:10:45.153 "strip_size_kb": 64, 00:10:45.153 "state": "configuring", 00:10:45.153 "raid_level": "raid0", 00:10:45.153 "superblock": true, 00:10:45.153 "num_base_bdevs": 3, 00:10:45.153 "num_base_bdevs_discovered": 1, 00:10:45.153 "num_base_bdevs_operational": 3, 00:10:45.153 "base_bdevs_list": [ 00:10:45.153 { 00:10:45.153 "name": "BaseBdev1", 00:10:45.153 "uuid": "5dc670a5-849a-4f60-b6d9-6e9788f5a287", 00:10:45.153 "is_configured": true, 00:10:45.153 "data_offset": 2048, 00:10:45.153 "data_size": 63488 00:10:45.153 }, 00:10:45.153 { 00:10:45.153 "name": "BaseBdev2", 00:10:45.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:45.153 "is_configured": false, 00:10:45.153 "data_offset": 0, 00:10:45.153 "data_size": 0 00:10:45.153 }, 00:10:45.153 { 00:10:45.153 "name": "BaseBdev3", 00:10:45.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:45.153 "is_configured": false, 00:10:45.153 "data_offset": 0, 00:10:45.153 "data_size": 0 00:10:45.153 } 00:10:45.153 ] 00:10:45.153 }' 00:10:45.153 04:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:45.153 04:11:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:45.717 04:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:45.975 [2024-05-15 04:11:33.802340] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:45.975 BaseBdev2 00:10:45.975 04:11:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:10:45.975 04:11:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:10:45.975 04:11:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:45.975 04:11:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:45.975 04:11:33 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:45.975 04:11:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:45.975 04:11:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:46.232 04:11:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:46.490 [ 00:10:46.490 { 00:10:46.490 "name": "BaseBdev2", 00:10:46.490 "aliases": [ 00:10:46.490 "78588570-890c-40aa-b62d-b861f6f90c95" 00:10:46.490 ], 00:10:46.490 "product_name": "Malloc disk", 00:10:46.490 "block_size": 512, 00:10:46.490 "num_blocks": 65536, 00:10:46.490 "uuid": "78588570-890c-40aa-b62d-b861f6f90c95", 00:10:46.490 "assigned_rate_limits": { 00:10:46.490 "rw_ios_per_sec": 0, 00:10:46.490 "rw_mbytes_per_sec": 0, 00:10:46.490 "r_mbytes_per_sec": 0, 00:10:46.490 "w_mbytes_per_sec": 0 00:10:46.490 }, 00:10:46.490 "claimed": true, 00:10:46.490 "claim_type": "exclusive_write", 00:10:46.490 "zoned": false, 00:10:46.490 "supported_io_types": { 00:10:46.490 "read": true, 00:10:46.490 "write": true, 00:10:46.490 "unmap": true, 00:10:46.490 "write_zeroes": true, 00:10:46.490 "flush": true, 00:10:46.490 "reset": true, 00:10:46.490 "compare": false, 00:10:46.490 "compare_and_write": false, 00:10:46.490 "abort": true, 00:10:46.490 "nvme_admin": false, 00:10:46.490 "nvme_io": false 00:10:46.490 }, 00:10:46.490 "memory_domains": [ 00:10:46.490 { 00:10:46.490 "dma_device_id": "system", 00:10:46.490 "dma_device_type": 1 00:10:46.490 }, 00:10:46.490 { 00:10:46.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:46.490 "dma_device_type": 2 00:10:46.490 } 00:10:46.490 ], 00:10:46.490 "driver_specific": {} 00:10:46.490 } 00:10:46.490 ] 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:46.490 04:11:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:46.490 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:46.748 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:46.748 "name": "Existed_Raid", 00:10:46.748 "uuid": "3dfc6bd4-c64f-472f-a54a-c8deb9201886", 00:10:46.748 "strip_size_kb": 64, 00:10:46.748 "state": "configuring", 00:10:46.748 "raid_level": "raid0", 00:10:46.748 "superblock": true, 00:10:46.748 "num_base_bdevs": 3, 00:10:46.748 "num_base_bdevs_discovered": 2, 00:10:46.748 "num_base_bdevs_operational": 3, 00:10:46.748 "base_bdevs_list": [ 00:10:46.748 { 00:10:46.748 "name": "BaseBdev1", 00:10:46.748 "uuid": "5dc670a5-849a-4f60-b6d9-6e9788f5a287", 00:10:46.748 "is_configured": true, 00:10:46.748 "data_offset": 2048, 00:10:46.748 "data_size": 63488 00:10:46.748 }, 00:10:46.748 { 00:10:46.748 "name": "BaseBdev2", 00:10:46.748 "uuid": "78588570-890c-40aa-b62d-b861f6f90c95", 00:10:46.748 "is_configured": true, 00:10:46.748 "data_offset": 2048, 00:10:46.748 "data_size": 63488 00:10:46.748 }, 00:10:46.748 { 00:10:46.748 "name": "BaseBdev3", 00:10:46.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:46.748 "is_configured": false, 00:10:46.748 "data_offset": 0, 00:10:46.748 "data_size": 0 00:10:46.748 } 00:10:46.748 ] 00:10:46.748 }' 00:10:46.748 04:11:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:46.748 04:11:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:47.312 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:10:47.570 [2024-05-15 04:11:35.379121] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:10:47.570 [2024-05-15 04:11:35.379362] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ea57e0 00:10:47.570 [2024-05-15 04:11:35.379377] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:10:47.570 [2024-05-15 04:11:35.379527] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ebc6d0 00:10:47.570 [2024-05-15 04:11:35.379652] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ea57e0 00:10:47.570 [2024-05-15 04:11:35.379666] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ea57e0 00:10:47.570 [2024-05-15 04:11:35.379754] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:47.570 BaseBdev3 00:10:47.570 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:10:47.570 04:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:10:47.570 04:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:47.570 04:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:47.570 04:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:47.570 04:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 
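At this point the third base bdev has been created and the raid has gone online: blockcnt 190464 in the DEBUG line is exactly 3 x 63488 data blocks, because with -s each 65536-block malloc disk reserves 2048 blocks for the on-disk superblock (data_offset 2048, data_size 63488 in the dumps that follow). The create-and-wait sequence that waitforbdev performs here can be reproduced with the same three RPCs; BaseBdev3 and the 32 MiB / 512-byte geometry are simply the values used in this run:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Create a 32 MiB malloc disk with 512-byte blocks (65536 blocks total).
    "$rpc" -s "$sock" bdev_malloc_create 32 512 -b BaseBdev3

    # Let examine callbacks (including bdev_raid's) finish claiming the bdev.
    "$rpc" -s "$sock" bdev_wait_for_examine

    # Confirm the bdev exists, waiting up to 2000 ms as the harness does.
    "$rpc" -s "$sock" bdev_get_bdevs -b BaseBdev3 -t 2000 | jq -r '.[0]["name"]'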
00:10:47.570 04:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:47.828 04:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:10:48.085 [ 00:10:48.085 { 00:10:48.085 "name": "BaseBdev3", 00:10:48.085 "aliases": [ 00:10:48.086 "6876fd47-4bd9-448d-a6d1-bdfefe4de96a" 00:10:48.086 ], 00:10:48.086 "product_name": "Malloc disk", 00:10:48.086 "block_size": 512, 00:10:48.086 "num_blocks": 65536, 00:10:48.086 "uuid": "6876fd47-4bd9-448d-a6d1-bdfefe4de96a", 00:10:48.086 "assigned_rate_limits": { 00:10:48.086 "rw_ios_per_sec": 0, 00:10:48.086 "rw_mbytes_per_sec": 0, 00:10:48.086 "r_mbytes_per_sec": 0, 00:10:48.086 "w_mbytes_per_sec": 0 00:10:48.086 }, 00:10:48.086 "claimed": true, 00:10:48.086 "claim_type": "exclusive_write", 00:10:48.086 "zoned": false, 00:10:48.086 "supported_io_types": { 00:10:48.086 "read": true, 00:10:48.086 "write": true, 00:10:48.086 "unmap": true, 00:10:48.086 "write_zeroes": true, 00:10:48.086 "flush": true, 00:10:48.086 "reset": true, 00:10:48.086 "compare": false, 00:10:48.086 "compare_and_write": false, 00:10:48.086 "abort": true, 00:10:48.086 "nvme_admin": false, 00:10:48.086 "nvme_io": false 00:10:48.086 }, 00:10:48.086 "memory_domains": [ 00:10:48.086 { 00:10:48.086 "dma_device_id": "system", 00:10:48.086 "dma_device_type": 1 00:10:48.086 }, 00:10:48.086 { 00:10:48.086 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:48.086 "dma_device_type": 2 00:10:48.086 } 00:10:48.086 ], 00:10:48.086 "driver_specific": {} 00:10:48.086 } 00:10:48.086 ] 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:48.086 04:11:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.086 04:11:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:48.343 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:48.343 "name": "Existed_Raid", 00:10:48.343 "uuid": "3dfc6bd4-c64f-472f-a54a-c8deb9201886", 00:10:48.343 "strip_size_kb": 64, 00:10:48.343 "state": "online", 00:10:48.343 "raid_level": "raid0", 00:10:48.343 "superblock": true, 00:10:48.343 "num_base_bdevs": 3, 00:10:48.343 "num_base_bdevs_discovered": 3, 00:10:48.343 "num_base_bdevs_operational": 3, 00:10:48.343 "base_bdevs_list": [ 00:10:48.343 { 00:10:48.343 "name": "BaseBdev1", 00:10:48.343 "uuid": "5dc670a5-849a-4f60-b6d9-6e9788f5a287", 00:10:48.343 "is_configured": true, 00:10:48.343 "data_offset": 2048, 00:10:48.343 "data_size": 63488 00:10:48.343 }, 00:10:48.343 { 00:10:48.343 "name": "BaseBdev2", 00:10:48.343 "uuid": "78588570-890c-40aa-b62d-b861f6f90c95", 00:10:48.343 "is_configured": true, 00:10:48.343 "data_offset": 2048, 00:10:48.343 "data_size": 63488 00:10:48.343 }, 00:10:48.343 { 00:10:48.343 "name": "BaseBdev3", 00:10:48.343 "uuid": "6876fd47-4bd9-448d-a6d1-bdfefe4de96a", 00:10:48.343 "is_configured": true, 00:10:48.343 "data_offset": 2048, 00:10:48.343 "data_size": 63488 00:10:48.343 } 00:10:48.343 ] 00:10:48.343 }' 00:10:48.344 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:48.344 04:11:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:48.908 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:10:48.908 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:10:48.908 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:10:48.908 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:10:48.908 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:10:48.908 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:10:48.908 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:48.908 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:10:49.165 [2024-05-15 04:11:36.951542] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:49.165 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:10:49.165 "name": "Existed_Raid", 00:10:49.165 "aliases": [ 00:10:49.165 "3dfc6bd4-c64f-472f-a54a-c8deb9201886" 00:10:49.165 ], 00:10:49.165 "product_name": "Raid Volume", 00:10:49.165 "block_size": 512, 00:10:49.165 "num_blocks": 190464, 00:10:49.165 "uuid": "3dfc6bd4-c64f-472f-a54a-c8deb9201886", 00:10:49.165 "assigned_rate_limits": { 00:10:49.165 "rw_ios_per_sec": 0, 00:10:49.165 "rw_mbytes_per_sec": 0, 00:10:49.165 "r_mbytes_per_sec": 0, 00:10:49.165 "w_mbytes_per_sec": 0 00:10:49.165 }, 00:10:49.165 "claimed": false, 00:10:49.165 "zoned": false, 00:10:49.165 "supported_io_types": { 00:10:49.165 "read": true, 00:10:49.165 "write": true, 00:10:49.165 "unmap": true, 00:10:49.165 "write_zeroes": true, 00:10:49.165 "flush": true, 00:10:49.165 "reset": true, 00:10:49.165 
"compare": false, 00:10:49.165 "compare_and_write": false, 00:10:49.165 "abort": false, 00:10:49.165 "nvme_admin": false, 00:10:49.165 "nvme_io": false 00:10:49.165 }, 00:10:49.165 "memory_domains": [ 00:10:49.165 { 00:10:49.165 "dma_device_id": "system", 00:10:49.165 "dma_device_type": 1 00:10:49.165 }, 00:10:49.165 { 00:10:49.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.165 "dma_device_type": 2 00:10:49.165 }, 00:10:49.165 { 00:10:49.165 "dma_device_id": "system", 00:10:49.165 "dma_device_type": 1 00:10:49.165 }, 00:10:49.165 { 00:10:49.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.165 "dma_device_type": 2 00:10:49.165 }, 00:10:49.165 { 00:10:49.165 "dma_device_id": "system", 00:10:49.165 "dma_device_type": 1 00:10:49.165 }, 00:10:49.165 { 00:10:49.165 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.165 "dma_device_type": 2 00:10:49.165 } 00:10:49.165 ], 00:10:49.165 "driver_specific": { 00:10:49.165 "raid": { 00:10:49.165 "uuid": "3dfc6bd4-c64f-472f-a54a-c8deb9201886", 00:10:49.165 "strip_size_kb": 64, 00:10:49.165 "state": "online", 00:10:49.165 "raid_level": "raid0", 00:10:49.165 "superblock": true, 00:10:49.165 "num_base_bdevs": 3, 00:10:49.165 "num_base_bdevs_discovered": 3, 00:10:49.165 "num_base_bdevs_operational": 3, 00:10:49.165 "base_bdevs_list": [ 00:10:49.165 { 00:10:49.165 "name": "BaseBdev1", 00:10:49.165 "uuid": "5dc670a5-849a-4f60-b6d9-6e9788f5a287", 00:10:49.165 "is_configured": true, 00:10:49.165 "data_offset": 2048, 00:10:49.165 "data_size": 63488 00:10:49.165 }, 00:10:49.165 { 00:10:49.165 "name": "BaseBdev2", 00:10:49.165 "uuid": "78588570-890c-40aa-b62d-b861f6f90c95", 00:10:49.165 "is_configured": true, 00:10:49.165 "data_offset": 2048, 00:10:49.165 "data_size": 63488 00:10:49.165 }, 00:10:49.165 { 00:10:49.165 "name": "BaseBdev3", 00:10:49.165 "uuid": "6876fd47-4bd9-448d-a6d1-bdfefe4de96a", 00:10:49.165 "is_configured": true, 00:10:49.165 "data_offset": 2048, 00:10:49.165 "data_size": 63488 00:10:49.165 } 00:10:49.165 ] 00:10:49.165 } 00:10:49.165 } 00:10:49.165 }' 00:10:49.165 04:11:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:49.165 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:10:49.165 BaseBdev2 00:10:49.165 BaseBdev3' 00:10:49.165 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:49.165 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:49.165 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:49.423 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:49.423 "name": "BaseBdev1", 00:10:49.423 "aliases": [ 00:10:49.423 "5dc670a5-849a-4f60-b6d9-6e9788f5a287" 00:10:49.423 ], 00:10:49.423 "product_name": "Malloc disk", 00:10:49.423 "block_size": 512, 00:10:49.423 "num_blocks": 65536, 00:10:49.423 "uuid": "5dc670a5-849a-4f60-b6d9-6e9788f5a287", 00:10:49.423 "assigned_rate_limits": { 00:10:49.423 "rw_ios_per_sec": 0, 00:10:49.423 "rw_mbytes_per_sec": 0, 00:10:49.423 "r_mbytes_per_sec": 0, 00:10:49.423 "w_mbytes_per_sec": 0 00:10:49.423 }, 00:10:49.423 "claimed": true, 00:10:49.423 "claim_type": "exclusive_write", 00:10:49.423 "zoned": false, 00:10:49.423 
"supported_io_types": { 00:10:49.423 "read": true, 00:10:49.423 "write": true, 00:10:49.423 "unmap": true, 00:10:49.423 "write_zeroes": true, 00:10:49.423 "flush": true, 00:10:49.423 "reset": true, 00:10:49.423 "compare": false, 00:10:49.423 "compare_and_write": false, 00:10:49.423 "abort": true, 00:10:49.423 "nvme_admin": false, 00:10:49.423 "nvme_io": false 00:10:49.423 }, 00:10:49.423 "memory_domains": [ 00:10:49.423 { 00:10:49.423 "dma_device_id": "system", 00:10:49.423 "dma_device_type": 1 00:10:49.423 }, 00:10:49.423 { 00:10:49.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.423 "dma_device_type": 2 00:10:49.423 } 00:10:49.423 ], 00:10:49.423 "driver_specific": {} 00:10:49.423 }' 00:10:49.423 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:49.423 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:49.423 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:49.423 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:49.423 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:49.423 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:49.423 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:49.423 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:49.680 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:49.680 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:49.680 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:49.680 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:49.680 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:49.680 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:49.680 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:49.938 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:49.938 "name": "BaseBdev2", 00:10:49.938 "aliases": [ 00:10:49.938 "78588570-890c-40aa-b62d-b861f6f90c95" 00:10:49.938 ], 00:10:49.938 "product_name": "Malloc disk", 00:10:49.938 "block_size": 512, 00:10:49.938 "num_blocks": 65536, 00:10:49.938 "uuid": "78588570-890c-40aa-b62d-b861f6f90c95", 00:10:49.938 "assigned_rate_limits": { 00:10:49.938 "rw_ios_per_sec": 0, 00:10:49.938 "rw_mbytes_per_sec": 0, 00:10:49.938 "r_mbytes_per_sec": 0, 00:10:49.938 "w_mbytes_per_sec": 0 00:10:49.938 }, 00:10:49.938 "claimed": true, 00:10:49.938 "claim_type": "exclusive_write", 00:10:49.938 "zoned": false, 00:10:49.938 "supported_io_types": { 00:10:49.938 "read": true, 00:10:49.938 "write": true, 00:10:49.938 "unmap": true, 00:10:49.938 "write_zeroes": true, 00:10:49.938 "flush": true, 00:10:49.938 "reset": true, 00:10:49.938 "compare": false, 00:10:49.938 "compare_and_write": false, 00:10:49.938 "abort": true, 00:10:49.938 "nvme_admin": false, 00:10:49.938 "nvme_io": false 00:10:49.938 }, 00:10:49.938 "memory_domains": [ 00:10:49.938 { 
00:10:49.938 "dma_device_id": "system", 00:10:49.938 "dma_device_type": 1 00:10:49.938 }, 00:10:49.938 { 00:10:49.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.938 "dma_device_type": 2 00:10:49.938 } 00:10:49.938 ], 00:10:49.938 "driver_specific": {} 00:10:49.938 }' 00:10:49.938 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:49.938 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:49.938 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:49.938 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:49.938 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:49.938 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:49.938 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:49.938 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:50.196 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:50.196 04:11:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:50.196 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:50.196 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:50.196 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:10:50.196 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:10:50.196 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:10:50.454 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:10:50.454 "name": "BaseBdev3", 00:10:50.454 "aliases": [ 00:10:50.454 "6876fd47-4bd9-448d-a6d1-bdfefe4de96a" 00:10:50.454 ], 00:10:50.454 "product_name": "Malloc disk", 00:10:50.454 "block_size": 512, 00:10:50.454 "num_blocks": 65536, 00:10:50.454 "uuid": "6876fd47-4bd9-448d-a6d1-bdfefe4de96a", 00:10:50.454 "assigned_rate_limits": { 00:10:50.454 "rw_ios_per_sec": 0, 00:10:50.454 "rw_mbytes_per_sec": 0, 00:10:50.454 "r_mbytes_per_sec": 0, 00:10:50.454 "w_mbytes_per_sec": 0 00:10:50.454 }, 00:10:50.454 "claimed": true, 00:10:50.454 "claim_type": "exclusive_write", 00:10:50.454 "zoned": false, 00:10:50.454 "supported_io_types": { 00:10:50.454 "read": true, 00:10:50.454 "write": true, 00:10:50.454 "unmap": true, 00:10:50.454 "write_zeroes": true, 00:10:50.454 "flush": true, 00:10:50.454 "reset": true, 00:10:50.454 "compare": false, 00:10:50.454 "compare_and_write": false, 00:10:50.454 "abort": true, 00:10:50.454 "nvme_admin": false, 00:10:50.454 "nvme_io": false 00:10:50.454 }, 00:10:50.454 "memory_domains": [ 00:10:50.454 { 00:10:50.454 "dma_device_id": "system", 00:10:50.454 "dma_device_type": 1 00:10:50.454 }, 00:10:50.454 { 00:10:50.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.454 "dma_device_type": 2 00:10:50.454 } 00:10:50.454 ], 00:10:50.454 "driver_specific": {} 00:10:50.454 }' 00:10:50.454 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:50.454 04:11:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:10:50.454 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:10:50.454 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:50.454 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:10:50.455 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:50.455 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:50.455 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:10:50.712 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:50.712 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:50.712 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:10:50.712 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:10:50.712 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:50.969 [2024-05-15 04:11:38.828383] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:50.969 [2024-05-15 04:11:38.828405] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:50.969 [2024-05-15 04:11:38.828448] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:50.969 04:11:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:51.227 04:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:51.227 "name": "Existed_Raid", 00:10:51.227 "uuid": "3dfc6bd4-c64f-472f-a54a-c8deb9201886", 00:10:51.227 "strip_size_kb": 64, 00:10:51.227 "state": "offline", 00:10:51.227 "raid_level": "raid0", 00:10:51.227 "superblock": true, 00:10:51.227 "num_base_bdevs": 3, 00:10:51.227 "num_base_bdevs_discovered": 2, 00:10:51.227 "num_base_bdevs_operational": 2, 00:10:51.227 "base_bdevs_list": [ 00:10:51.227 { 00:10:51.227 "name": null, 00:10:51.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:51.227 "is_configured": false, 00:10:51.227 "data_offset": 2048, 00:10:51.227 "data_size": 63488 00:10:51.227 }, 00:10:51.227 { 00:10:51.227 "name": "BaseBdev2", 00:10:51.227 "uuid": "78588570-890c-40aa-b62d-b861f6f90c95", 00:10:51.227 "is_configured": true, 00:10:51.227 "data_offset": 2048, 00:10:51.227 "data_size": 63488 00:10:51.227 }, 00:10:51.227 { 00:10:51.227 "name": "BaseBdev3", 00:10:51.227 "uuid": "6876fd47-4bd9-448d-a6d1-bdfefe4de96a", 00:10:51.227 "is_configured": true, 00:10:51.227 "data_offset": 2048, 00:10:51.227 "data_size": 63488 00:10:51.227 } 00:10:51.227 ] 00:10:51.227 }' 00:10:51.227 04:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:51.227 04:11:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:51.796 04:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:10:51.796 04:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:51.796 04:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:51.796 04:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:10:52.053 04:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:10:52.053 04:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:52.053 04:11:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:52.311 [2024-05-15 04:11:40.169670] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:52.311 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:10:52.311 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:52.311 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.311 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:10:52.570 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:10:52.570 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:52.570 04:11:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:10:52.828 [2024-05-15 04:11:40.688412] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:10:52.828 [2024-05-15 04:11:40.688470] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ea57e0 name Existed_Raid, state offline 00:10:52.828 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:10:52.828 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:10:52.828 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:52.828 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:10:53.085 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:10:53.085 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:10:53.085 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:10:53.085 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:10:53.085 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:10:53.085 04:11:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:53.343 BaseBdev2 00:10:53.343 04:11:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:10:53.343 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:10:53.343 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:53.343 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:53.343 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:53.343 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:53.343 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:53.600 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:53.858 [ 00:10:53.858 { 00:10:53.858 "name": "BaseBdev2", 00:10:53.858 "aliases": [ 00:10:53.858 "0444dc01-daf2-4e99-a5ca-da74def3d085" 00:10:53.858 ], 00:10:53.858 "product_name": "Malloc disk", 00:10:53.858 "block_size": 512, 00:10:53.858 "num_blocks": 65536, 00:10:53.858 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 00:10:53.858 "assigned_rate_limits": { 00:10:53.858 "rw_ios_per_sec": 0, 00:10:53.858 "rw_mbytes_per_sec": 0, 00:10:53.858 "r_mbytes_per_sec": 0, 00:10:53.858 "w_mbytes_per_sec": 0 00:10:53.858 }, 00:10:53.858 "claimed": false, 00:10:53.858 "zoned": false, 00:10:53.858 "supported_io_types": { 00:10:53.858 "read": true, 00:10:53.858 "write": true, 
00:10:53.858 "unmap": true, 00:10:53.858 "write_zeroes": true, 00:10:53.858 "flush": true, 00:10:53.858 "reset": true, 00:10:53.858 "compare": false, 00:10:53.858 "compare_and_write": false, 00:10:53.858 "abort": true, 00:10:53.858 "nvme_admin": false, 00:10:53.858 "nvme_io": false 00:10:53.858 }, 00:10:53.858 "memory_domains": [ 00:10:53.858 { 00:10:53.858 "dma_device_id": "system", 00:10:53.858 "dma_device_type": 1 00:10:53.858 }, 00:10:53.858 { 00:10:53.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:53.858 "dma_device_type": 2 00:10:53.858 } 00:10:53.858 ], 00:10:53.858 "driver_specific": {} 00:10:53.858 } 00:10:53.858 ] 00:10:53.858 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:53.858 04:11:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:10:53.858 04:11:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:10:53.858 04:11:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:10:54.116 BaseBdev3 00:10:54.116 04:11:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:10:54.116 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:10:54.116 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:54.116 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:54.116 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:54.116 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:54.116 04:11:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:54.374 04:11:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:10:54.632 [ 00:10:54.632 { 00:10:54.632 "name": "BaseBdev3", 00:10:54.632 "aliases": [ 00:10:54.632 "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73" 00:10:54.632 ], 00:10:54.632 "product_name": "Malloc disk", 00:10:54.632 "block_size": 512, 00:10:54.632 "num_blocks": 65536, 00:10:54.632 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:10:54.632 "assigned_rate_limits": { 00:10:54.632 "rw_ios_per_sec": 0, 00:10:54.632 "rw_mbytes_per_sec": 0, 00:10:54.632 "r_mbytes_per_sec": 0, 00:10:54.632 "w_mbytes_per_sec": 0 00:10:54.632 }, 00:10:54.632 "claimed": false, 00:10:54.632 "zoned": false, 00:10:54.632 "supported_io_types": { 00:10:54.632 "read": true, 00:10:54.632 "write": true, 00:10:54.632 "unmap": true, 00:10:54.632 "write_zeroes": true, 00:10:54.632 "flush": true, 00:10:54.632 "reset": true, 00:10:54.632 "compare": false, 00:10:54.632 "compare_and_write": false, 00:10:54.632 "abort": true, 00:10:54.632 "nvme_admin": false, 00:10:54.632 "nvme_io": false 00:10:54.632 }, 00:10:54.632 "memory_domains": [ 00:10:54.632 { 00:10:54.632 "dma_device_id": "system", 00:10:54.632 "dma_device_type": 1 00:10:54.632 }, 00:10:54.632 { 00:10:54.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:54.632 "dma_device_type": 2 00:10:54.632 } 
00:10:54.632 ], 00:10:54.632 "driver_specific": {} 00:10:54.632 } 00:10:54.632 ] 00:10:54.632 04:11:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:54.632 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:10:54.632 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:10:54.632 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:10:54.890 [2024-05-15 04:11:42.773407] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:54.890 [2024-05-15 04:11:42.773443] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:54.890 [2024-05-15 04:11:42.773474] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:54.890 [2024-05-15 04:11:42.774777] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:54.890 04:11:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:55.148 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:55.148 "name": "Existed_Raid", 00:10:55.148 "uuid": "b18acf29-7131-406c-926a-fe8bcc5fd8f2", 00:10:55.148 "strip_size_kb": 64, 00:10:55.148 "state": "configuring", 00:10:55.148 "raid_level": "raid0", 00:10:55.148 "superblock": true, 00:10:55.148 "num_base_bdevs": 3, 00:10:55.148 "num_base_bdevs_discovered": 2, 00:10:55.148 "num_base_bdevs_operational": 3, 00:10:55.148 "base_bdevs_list": [ 00:10:55.148 { 00:10:55.148 "name": "BaseBdev1", 00:10:55.148 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:55.148 "is_configured": false, 00:10:55.148 "data_offset": 0, 00:10:55.148 "data_size": 0 00:10:55.148 }, 00:10:55.148 { 00:10:55.148 "name": "BaseBdev2", 00:10:55.148 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 
00:10:55.148 "is_configured": true, 00:10:55.148 "data_offset": 2048, 00:10:55.148 "data_size": 63488 00:10:55.148 }, 00:10:55.148 { 00:10:55.148 "name": "BaseBdev3", 00:10:55.148 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:10:55.148 "is_configured": true, 00:10:55.148 "data_offset": 2048, 00:10:55.148 "data_size": 63488 00:10:55.148 } 00:10:55.148 ] 00:10:55.148 }' 00:10:55.148 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:55.148 04:11:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:55.712 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:10:55.970 [2024-05-15 04:11:43.816141] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:55.970 04:11:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.227 04:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:56.227 "name": "Existed_Raid", 00:10:56.227 "uuid": "b18acf29-7131-406c-926a-fe8bcc5fd8f2", 00:10:56.227 "strip_size_kb": 64, 00:10:56.227 "state": "configuring", 00:10:56.227 "raid_level": "raid0", 00:10:56.227 "superblock": true, 00:10:56.227 "num_base_bdevs": 3, 00:10:56.227 "num_base_bdevs_discovered": 1, 00:10:56.227 "num_base_bdevs_operational": 3, 00:10:56.227 "base_bdevs_list": [ 00:10:56.227 { 00:10:56.227 "name": "BaseBdev1", 00:10:56.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:56.227 "is_configured": false, 00:10:56.227 "data_offset": 0, 00:10:56.227 "data_size": 0 00:10:56.227 }, 00:10:56.227 { 00:10:56.227 "name": null, 00:10:56.227 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 00:10:56.227 "is_configured": false, 00:10:56.227 "data_offset": 2048, 00:10:56.227 "data_size": 63488 00:10:56.227 }, 00:10:56.227 { 00:10:56.227 "name": "BaseBdev3", 00:10:56.227 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:10:56.227 "is_configured": true, 00:10:56.228 
"data_offset": 2048, 00:10:56.228 "data_size": 63488 00:10:56.228 } 00:10:56.228 ] 00:10:56.228 }' 00:10:56.228 04:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:56.228 04:11:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:56.793 04:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.793 04:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:10:57.051 04:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:10:57.051 04:11:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:57.310 [2024-05-15 04:11:45.095853] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:57.310 BaseBdev1 00:10:57.310 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:10:57.310 04:11:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:10:57.310 04:11:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:10:57.310 04:11:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:10:57.310 04:11:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:10:57.310 04:11:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:10:57.310 04:11:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:57.569 04:11:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:57.569 [ 00:10:57.569 { 00:10:57.569 "name": "BaseBdev1", 00:10:57.569 "aliases": [ 00:10:57.569 "6880181b-efdf-4909-a22e-eab30a56e0b4" 00:10:57.569 ], 00:10:57.569 "product_name": "Malloc disk", 00:10:57.569 "block_size": 512, 00:10:57.569 "num_blocks": 65536, 00:10:57.569 "uuid": "6880181b-efdf-4909-a22e-eab30a56e0b4", 00:10:57.569 "assigned_rate_limits": { 00:10:57.569 "rw_ios_per_sec": 0, 00:10:57.569 "rw_mbytes_per_sec": 0, 00:10:57.569 "r_mbytes_per_sec": 0, 00:10:57.569 "w_mbytes_per_sec": 0 00:10:57.569 }, 00:10:57.569 "claimed": true, 00:10:57.569 "claim_type": "exclusive_write", 00:10:57.569 "zoned": false, 00:10:57.569 "supported_io_types": { 00:10:57.569 "read": true, 00:10:57.569 "write": true, 00:10:57.569 "unmap": true, 00:10:57.569 "write_zeroes": true, 00:10:57.569 "flush": true, 00:10:57.569 "reset": true, 00:10:57.569 "compare": false, 00:10:57.569 "compare_and_write": false, 00:10:57.569 "abort": true, 00:10:57.569 "nvme_admin": false, 00:10:57.569 "nvme_io": false 00:10:57.569 }, 00:10:57.569 "memory_domains": [ 00:10:57.569 { 00:10:57.569 "dma_device_id": "system", 00:10:57.569 "dma_device_type": 1 00:10:57.569 }, 00:10:57.569 { 00:10:57.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:57.569 "dma_device_type": 2 00:10:57.569 } 00:10:57.569 ], 
00:10:57.569 "driver_specific": {} 00:10:57.569 } 00:10:57.569 ] 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:57.828 "name": "Existed_Raid", 00:10:57.828 "uuid": "b18acf29-7131-406c-926a-fe8bcc5fd8f2", 00:10:57.828 "strip_size_kb": 64, 00:10:57.828 "state": "configuring", 00:10:57.828 "raid_level": "raid0", 00:10:57.828 "superblock": true, 00:10:57.828 "num_base_bdevs": 3, 00:10:57.828 "num_base_bdevs_discovered": 2, 00:10:57.828 "num_base_bdevs_operational": 3, 00:10:57.828 "base_bdevs_list": [ 00:10:57.828 { 00:10:57.828 "name": "BaseBdev1", 00:10:57.828 "uuid": "6880181b-efdf-4909-a22e-eab30a56e0b4", 00:10:57.828 "is_configured": true, 00:10:57.828 "data_offset": 2048, 00:10:57.828 "data_size": 63488 00:10:57.828 }, 00:10:57.828 { 00:10:57.828 "name": null, 00:10:57.828 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 00:10:57.828 "is_configured": false, 00:10:57.828 "data_offset": 2048, 00:10:57.828 "data_size": 63488 00:10:57.828 }, 00:10:57.828 { 00:10:57.828 "name": "BaseBdev3", 00:10:57.828 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:10:57.828 "is_configured": true, 00:10:57.828 "data_offset": 2048, 00:10:57.828 "data_size": 63488 00:10:57.828 } 00:10:57.828 ] 00:10:57.828 }' 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:57.828 04:11:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:58.394 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.394 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:10:58.652 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ 
true == \t\r\u\e ]] 00:10:58.652 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:10:58.911 [2024-05-15 04:11:46.824528] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:58.911 04:11:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:59.169 04:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:10:59.169 "name": "Existed_Raid", 00:10:59.169 "uuid": "b18acf29-7131-406c-926a-fe8bcc5fd8f2", 00:10:59.169 "strip_size_kb": 64, 00:10:59.169 "state": "configuring", 00:10:59.169 "raid_level": "raid0", 00:10:59.169 "superblock": true, 00:10:59.169 "num_base_bdevs": 3, 00:10:59.169 "num_base_bdevs_discovered": 1, 00:10:59.169 "num_base_bdevs_operational": 3, 00:10:59.169 "base_bdevs_list": [ 00:10:59.169 { 00:10:59.169 "name": "BaseBdev1", 00:10:59.169 "uuid": "6880181b-efdf-4909-a22e-eab30a56e0b4", 00:10:59.169 "is_configured": true, 00:10:59.169 "data_offset": 2048, 00:10:59.169 "data_size": 63488 00:10:59.169 }, 00:10:59.169 { 00:10:59.169 "name": null, 00:10:59.169 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 00:10:59.169 "is_configured": false, 00:10:59.169 "data_offset": 2048, 00:10:59.169 "data_size": 63488 00:10:59.169 }, 00:10:59.169 { 00:10:59.169 "name": null, 00:10:59.169 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:10:59.169 "is_configured": false, 00:10:59.169 "data_offset": 2048, 00:10:59.169 "data_size": 63488 00:10:59.169 } 00:10:59.169 ] 00:10:59.169 }' 00:10:59.169 04:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:10:59.169 04:11:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:59.734 04:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:59.734 04:11:47 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:10:59.992 04:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:10:59.992 04:11:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:00.249 [2024-05-15 04:11:48.095957] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:00.249 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:00.507 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:00.507 "name": "Existed_Raid", 00:11:00.507 "uuid": "b18acf29-7131-406c-926a-fe8bcc5fd8f2", 00:11:00.507 "strip_size_kb": 64, 00:11:00.507 "state": "configuring", 00:11:00.507 "raid_level": "raid0", 00:11:00.507 "superblock": true, 00:11:00.507 "num_base_bdevs": 3, 00:11:00.507 "num_base_bdevs_discovered": 2, 00:11:00.507 "num_base_bdevs_operational": 3, 00:11:00.507 "base_bdevs_list": [ 00:11:00.507 { 00:11:00.507 "name": "BaseBdev1", 00:11:00.507 "uuid": "6880181b-efdf-4909-a22e-eab30a56e0b4", 00:11:00.507 "is_configured": true, 00:11:00.507 "data_offset": 2048, 00:11:00.507 "data_size": 63488 00:11:00.507 }, 00:11:00.507 { 00:11:00.507 "name": null, 00:11:00.507 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 00:11:00.507 "is_configured": false, 00:11:00.507 "data_offset": 2048, 00:11:00.507 "data_size": 63488 00:11:00.507 }, 00:11:00.507 { 00:11:00.507 "name": "BaseBdev3", 00:11:00.507 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:11:00.507 "is_configured": true, 00:11:00.507 "data_offset": 2048, 00:11:00.507 "data_size": 63488 00:11:00.507 } 00:11:00.507 ] 00:11:00.507 }' 00:11:00.507 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:00.507 04:11:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:01.128 04:11:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.128 04:11:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:01.387 [2024-05-15 04:11:49.359367] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:01.387 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:01.645 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:01.645 "name": "Existed_Raid", 00:11:01.645 "uuid": "b18acf29-7131-406c-926a-fe8bcc5fd8f2", 00:11:01.645 "strip_size_kb": 64, 00:11:01.645 "state": "configuring", 00:11:01.645 "raid_level": "raid0", 00:11:01.645 "superblock": true, 00:11:01.645 "num_base_bdevs": 3, 00:11:01.645 "num_base_bdevs_discovered": 1, 00:11:01.645 "num_base_bdevs_operational": 3, 00:11:01.645 "base_bdevs_list": [ 00:11:01.645 { 00:11:01.645 "name": null, 00:11:01.645 "uuid": "6880181b-efdf-4909-a22e-eab30a56e0b4", 00:11:01.645 "is_configured": false, 00:11:01.645 "data_offset": 2048, 00:11:01.645 "data_size": 63488 00:11:01.645 }, 00:11:01.645 { 00:11:01.645 "name": null, 00:11:01.645 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 00:11:01.645 "is_configured": false, 00:11:01.645 "data_offset": 2048, 00:11:01.645 "data_size": 63488 00:11:01.645 }, 00:11:01.645 { 00:11:01.645 "name": "BaseBdev3", 00:11:01.645 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:11:01.645 "is_configured": true, 00:11:01.645 "data_offset": 2048, 00:11:01.645 "data_size": 63488 00:11:01.645 } 00:11:01.645 ] 00:11:01.645 }' 00:11:01.645 04:11:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:11:01.645 04:11:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:02.211 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:02.211 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:02.468 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:11:02.468 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:02.727 [2024-05-15 04:11:50.682381] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:02.727 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:02.985 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:02.985 "name": "Existed_Raid", 00:11:02.985 "uuid": "b18acf29-7131-406c-926a-fe8bcc5fd8f2", 00:11:02.985 "strip_size_kb": 64, 00:11:02.985 "state": "configuring", 00:11:02.985 "raid_level": "raid0", 00:11:02.985 "superblock": true, 00:11:02.985 "num_base_bdevs": 3, 00:11:02.985 "num_base_bdevs_discovered": 2, 00:11:02.985 "num_base_bdevs_operational": 3, 00:11:02.985 "base_bdevs_list": [ 00:11:02.985 { 00:11:02.985 "name": null, 00:11:02.985 "uuid": "6880181b-efdf-4909-a22e-eab30a56e0b4", 00:11:02.985 "is_configured": false, 00:11:02.985 "data_offset": 2048, 00:11:02.985 "data_size": 63488 00:11:02.985 }, 00:11:02.985 { 00:11:02.985 "name": "BaseBdev2", 00:11:02.985 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 00:11:02.985 "is_configured": true, 00:11:02.985 "data_offset": 2048, 00:11:02.985 "data_size": 63488 00:11:02.985 }, 00:11:02.985 { 00:11:02.985 "name": "BaseBdev3", 00:11:02.985 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:11:02.985 "is_configured": true, 00:11:02.985 
"data_offset": 2048, 00:11:02.985 "data_size": 63488 00:11:02.985 } 00:11:02.985 ] 00:11:02.985 }' 00:11:02.985 04:11:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:02.985 04:11:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:03.550 04:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.550 04:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:03.807 04:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:11:03.807 04:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:03.807 04:11:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:04.065 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6880181b-efdf-4909-a22e-eab30a56e0b4 00:11:04.323 [2024-05-15 04:11:52.240607] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:11:04.323 [2024-05-15 04:11:52.240835] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x20494a0 00:11:04.323 [2024-05-15 04:11:52.240883] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:04.323 [2024-05-15 04:11:52.241037] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ea42e0 00:11:04.323 [2024-05-15 04:11:52.241175] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20494a0 00:11:04.323 [2024-05-15 04:11:52.241188] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20494a0 00:11:04.323 [2024-05-15 04:11:52.241283] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:04.323 NewBaseBdev 00:11:04.323 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:11:04.324 04:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:11:04.324 04:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:04.324 04:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:11:04.324 04:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:04.324 04:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:04.324 04:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:04.582 04:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:04.840 [ 00:11:04.840 { 00:11:04.840 "name": "NewBaseBdev", 00:11:04.840 "aliases": [ 00:11:04.840 "6880181b-efdf-4909-a22e-eab30a56e0b4" 00:11:04.840 ], 
00:11:04.840 "product_name": "Malloc disk", 00:11:04.840 "block_size": 512, 00:11:04.840 "num_blocks": 65536, 00:11:04.840 "uuid": "6880181b-efdf-4909-a22e-eab30a56e0b4", 00:11:04.840 "assigned_rate_limits": { 00:11:04.840 "rw_ios_per_sec": 0, 00:11:04.840 "rw_mbytes_per_sec": 0, 00:11:04.840 "r_mbytes_per_sec": 0, 00:11:04.840 "w_mbytes_per_sec": 0 00:11:04.840 }, 00:11:04.840 "claimed": true, 00:11:04.840 "claim_type": "exclusive_write", 00:11:04.840 "zoned": false, 00:11:04.840 "supported_io_types": { 00:11:04.840 "read": true, 00:11:04.840 "write": true, 00:11:04.840 "unmap": true, 00:11:04.840 "write_zeroes": true, 00:11:04.840 "flush": true, 00:11:04.840 "reset": true, 00:11:04.840 "compare": false, 00:11:04.840 "compare_and_write": false, 00:11:04.840 "abort": true, 00:11:04.840 "nvme_admin": false, 00:11:04.840 "nvme_io": false 00:11:04.840 }, 00:11:04.840 "memory_domains": [ 00:11:04.840 { 00:11:04.840 "dma_device_id": "system", 00:11:04.840 "dma_device_type": 1 00:11:04.840 }, 00:11:04.840 { 00:11:04.840 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:04.840 "dma_device_type": 2 00:11:04.840 } 00:11:04.840 ], 00:11:04.840 "driver_specific": {} 00:11:04.840 } 00:11:04.840 ] 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:04.840 04:11:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:05.098 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:05.098 "name": "Existed_Raid", 00:11:05.098 "uuid": "b18acf29-7131-406c-926a-fe8bcc5fd8f2", 00:11:05.098 "strip_size_kb": 64, 00:11:05.098 "state": "online", 00:11:05.098 "raid_level": "raid0", 00:11:05.098 "superblock": true, 00:11:05.098 "num_base_bdevs": 3, 00:11:05.098 "num_base_bdevs_discovered": 3, 00:11:05.098 "num_base_bdevs_operational": 3, 00:11:05.098 "base_bdevs_list": [ 00:11:05.098 { 00:11:05.098 "name": "NewBaseBdev", 00:11:05.098 "uuid": "6880181b-efdf-4909-a22e-eab30a56e0b4", 00:11:05.098 "is_configured": true, 00:11:05.098 "data_offset": 2048, 00:11:05.098 "data_size": 63488 00:11:05.098 
}, 00:11:05.098 { 00:11:05.098 "name": "BaseBdev2", 00:11:05.098 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 00:11:05.098 "is_configured": true, 00:11:05.098 "data_offset": 2048, 00:11:05.098 "data_size": 63488 00:11:05.098 }, 00:11:05.098 { 00:11:05.098 "name": "BaseBdev3", 00:11:05.098 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:11:05.098 "is_configured": true, 00:11:05.098 "data_offset": 2048, 00:11:05.098 "data_size": 63488 00:11:05.098 } 00:11:05.098 ] 00:11:05.098 }' 00:11:05.098 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:05.098 04:11:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:05.663 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:11:05.663 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:05.663 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:05.663 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:05.663 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:05.663 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:11:05.664 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:05.664 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:05.922 [2024-05-15 04:11:53.780904] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:05.922 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:05.922 "name": "Existed_Raid", 00:11:05.922 "aliases": [ 00:11:05.922 "b18acf29-7131-406c-926a-fe8bcc5fd8f2" 00:11:05.922 ], 00:11:05.922 "product_name": "Raid Volume", 00:11:05.922 "block_size": 512, 00:11:05.922 "num_blocks": 190464, 00:11:05.922 "uuid": "b18acf29-7131-406c-926a-fe8bcc5fd8f2", 00:11:05.922 "assigned_rate_limits": { 00:11:05.922 "rw_ios_per_sec": 0, 00:11:05.922 "rw_mbytes_per_sec": 0, 00:11:05.922 "r_mbytes_per_sec": 0, 00:11:05.922 "w_mbytes_per_sec": 0 00:11:05.922 }, 00:11:05.922 "claimed": false, 00:11:05.922 "zoned": false, 00:11:05.922 "supported_io_types": { 00:11:05.922 "read": true, 00:11:05.922 "write": true, 00:11:05.922 "unmap": true, 00:11:05.922 "write_zeroes": true, 00:11:05.922 "flush": true, 00:11:05.922 "reset": true, 00:11:05.922 "compare": false, 00:11:05.922 "compare_and_write": false, 00:11:05.922 "abort": false, 00:11:05.922 "nvme_admin": false, 00:11:05.922 "nvme_io": false 00:11:05.922 }, 00:11:05.922 "memory_domains": [ 00:11:05.922 { 00:11:05.922 "dma_device_id": "system", 00:11:05.922 "dma_device_type": 1 00:11:05.922 }, 00:11:05.922 { 00:11:05.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.922 "dma_device_type": 2 00:11:05.922 }, 00:11:05.922 { 00:11:05.922 "dma_device_id": "system", 00:11:05.922 "dma_device_type": 1 00:11:05.922 }, 00:11:05.922 { 00:11:05.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.922 "dma_device_type": 2 00:11:05.922 }, 00:11:05.922 { 00:11:05.922 "dma_device_id": "system", 00:11:05.922 "dma_device_type": 1 00:11:05.922 }, 00:11:05.922 { 00:11:05.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:05.922 
"dma_device_type": 2 00:11:05.922 } 00:11:05.922 ], 00:11:05.922 "driver_specific": { 00:11:05.922 "raid": { 00:11:05.922 "uuid": "b18acf29-7131-406c-926a-fe8bcc5fd8f2", 00:11:05.922 "strip_size_kb": 64, 00:11:05.922 "state": "online", 00:11:05.922 "raid_level": "raid0", 00:11:05.922 "superblock": true, 00:11:05.922 "num_base_bdevs": 3, 00:11:05.922 "num_base_bdevs_discovered": 3, 00:11:05.922 "num_base_bdevs_operational": 3, 00:11:05.922 "base_bdevs_list": [ 00:11:05.922 { 00:11:05.922 "name": "NewBaseBdev", 00:11:05.922 "uuid": "6880181b-efdf-4909-a22e-eab30a56e0b4", 00:11:05.922 "is_configured": true, 00:11:05.922 "data_offset": 2048, 00:11:05.922 "data_size": 63488 00:11:05.922 }, 00:11:05.922 { 00:11:05.922 "name": "BaseBdev2", 00:11:05.922 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 00:11:05.922 "is_configured": true, 00:11:05.922 "data_offset": 2048, 00:11:05.922 "data_size": 63488 00:11:05.922 }, 00:11:05.922 { 00:11:05.922 "name": "BaseBdev3", 00:11:05.922 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:11:05.922 "is_configured": true, 00:11:05.922 "data_offset": 2048, 00:11:05.922 "data_size": 63488 00:11:05.922 } 00:11:05.922 ] 00:11:05.922 } 00:11:05.922 } 00:11:05.922 }' 00:11:05.922 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:05.922 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:11:05.922 BaseBdev2 00:11:05.922 BaseBdev3' 00:11:05.922 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:05.922 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:11:05.922 04:11:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:06.180 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:06.180 "name": "NewBaseBdev", 00:11:06.180 "aliases": [ 00:11:06.180 "6880181b-efdf-4909-a22e-eab30a56e0b4" 00:11:06.180 ], 00:11:06.180 "product_name": "Malloc disk", 00:11:06.180 "block_size": 512, 00:11:06.180 "num_blocks": 65536, 00:11:06.180 "uuid": "6880181b-efdf-4909-a22e-eab30a56e0b4", 00:11:06.180 "assigned_rate_limits": { 00:11:06.180 "rw_ios_per_sec": 0, 00:11:06.180 "rw_mbytes_per_sec": 0, 00:11:06.180 "r_mbytes_per_sec": 0, 00:11:06.180 "w_mbytes_per_sec": 0 00:11:06.180 }, 00:11:06.180 "claimed": true, 00:11:06.180 "claim_type": "exclusive_write", 00:11:06.180 "zoned": false, 00:11:06.180 "supported_io_types": { 00:11:06.180 "read": true, 00:11:06.180 "write": true, 00:11:06.180 "unmap": true, 00:11:06.180 "write_zeroes": true, 00:11:06.180 "flush": true, 00:11:06.180 "reset": true, 00:11:06.180 "compare": false, 00:11:06.180 "compare_and_write": false, 00:11:06.180 "abort": true, 00:11:06.180 "nvme_admin": false, 00:11:06.180 "nvme_io": false 00:11:06.180 }, 00:11:06.180 "memory_domains": [ 00:11:06.180 { 00:11:06.180 "dma_device_id": "system", 00:11:06.180 "dma_device_type": 1 00:11:06.180 }, 00:11:06.180 { 00:11:06.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.180 "dma_device_type": 2 00:11:06.180 } 00:11:06.180 ], 00:11:06.180 "driver_specific": {} 00:11:06.180 }' 00:11:06.180 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:06.180 04:11:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:06.180 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:06.180 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:06.180 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:06.438 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.438 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:06.438 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:06.438 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.438 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:06.438 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:06.438 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:06.438 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:06.438 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:06.438 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:06.696 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:06.696 "name": "BaseBdev2", 00:11:06.696 "aliases": [ 00:11:06.696 "0444dc01-daf2-4e99-a5ca-da74def3d085" 00:11:06.696 ], 00:11:06.697 "product_name": "Malloc disk", 00:11:06.697 "block_size": 512, 00:11:06.697 "num_blocks": 65536, 00:11:06.697 "uuid": "0444dc01-daf2-4e99-a5ca-da74def3d085", 00:11:06.697 "assigned_rate_limits": { 00:11:06.697 "rw_ios_per_sec": 0, 00:11:06.697 "rw_mbytes_per_sec": 0, 00:11:06.697 "r_mbytes_per_sec": 0, 00:11:06.697 "w_mbytes_per_sec": 0 00:11:06.697 }, 00:11:06.697 "claimed": true, 00:11:06.697 "claim_type": "exclusive_write", 00:11:06.697 "zoned": false, 00:11:06.697 "supported_io_types": { 00:11:06.697 "read": true, 00:11:06.697 "write": true, 00:11:06.697 "unmap": true, 00:11:06.697 "write_zeroes": true, 00:11:06.697 "flush": true, 00:11:06.697 "reset": true, 00:11:06.697 "compare": false, 00:11:06.697 "compare_and_write": false, 00:11:06.697 "abort": true, 00:11:06.697 "nvme_admin": false, 00:11:06.697 "nvme_io": false 00:11:06.697 }, 00:11:06.697 "memory_domains": [ 00:11:06.697 { 00:11:06.697 "dma_device_id": "system", 00:11:06.697 "dma_device_type": 1 00:11:06.697 }, 00:11:06.697 { 00:11:06.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:06.697 "dma_device_type": 2 00:11:06.697 } 00:11:06.697 ], 00:11:06.697 "driver_specific": {} 00:11:06.697 }' 00:11:06.697 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:06.697 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:06.697 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:06.697 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:06.697 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:06.954 04:11:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:06.954 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:06.954 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:06.954 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:06.954 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:06.954 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:06.954 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:06.954 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:06.955 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:06.955 04:11:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:07.213 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:07.213 "name": "BaseBdev3", 00:11:07.213 "aliases": [ 00:11:07.213 "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73" 00:11:07.213 ], 00:11:07.213 "product_name": "Malloc disk", 00:11:07.213 "block_size": 512, 00:11:07.213 "num_blocks": 65536, 00:11:07.213 "uuid": "bb6d71ba-9e9c-425f-8419-d6d4fb14ed73", 00:11:07.213 "assigned_rate_limits": { 00:11:07.213 "rw_ios_per_sec": 0, 00:11:07.213 "rw_mbytes_per_sec": 0, 00:11:07.213 "r_mbytes_per_sec": 0, 00:11:07.213 "w_mbytes_per_sec": 0 00:11:07.213 }, 00:11:07.213 "claimed": true, 00:11:07.213 "claim_type": "exclusive_write", 00:11:07.213 "zoned": false, 00:11:07.213 "supported_io_types": { 00:11:07.213 "read": true, 00:11:07.213 "write": true, 00:11:07.213 "unmap": true, 00:11:07.213 "write_zeroes": true, 00:11:07.213 "flush": true, 00:11:07.213 "reset": true, 00:11:07.213 "compare": false, 00:11:07.213 "compare_and_write": false, 00:11:07.213 "abort": true, 00:11:07.213 "nvme_admin": false, 00:11:07.213 "nvme_io": false 00:11:07.213 }, 00:11:07.213 "memory_domains": [ 00:11:07.213 { 00:11:07.213 "dma_device_id": "system", 00:11:07.213 "dma_device_type": 1 00:11:07.213 }, 00:11:07.213 { 00:11:07.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:07.213 "dma_device_type": 2 00:11:07.213 } 00:11:07.213 ], 00:11:07.213 "driver_specific": {} 00:11:07.213 }' 00:11:07.213 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:07.213 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:07.213 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:07.213 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:07.213 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:07.470 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:07.470 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:07.470 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:07.470 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:07.470 04:11:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:07.470 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:07.470 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:07.470 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:07.728 [2024-05-15 04:11:55.633642] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:07.728 [2024-05-15 04:11:55.633664] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:07.728 [2024-05-15 04:11:55.633723] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:07.728 [2024-05-15 04:11:55.633778] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:07.728 [2024-05-15 04:11:55.633790] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20494a0 name Existed_Raid, state offline 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 3842228 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3842228 ']' 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 3842228 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3842228 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3842228' 00:11:07.728 killing process with pid 3842228 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 3842228 00:11:07.728 [2024-05-15 04:11:55.683228] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:07.728 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 3842228 00:11:07.728 [2024-05-15 04:11:55.717712] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:07.985 04:11:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:11:07.985 00:11:07.985 real 0m27.230s 00:11:07.985 user 0m51.282s 00:11:07.985 sys 0m3.659s 00:11:07.985 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:07.985 04:11:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:07.985 ************************************ 00:11:07.985 END TEST raid_state_function_test_sb 00:11:07.985 ************************************ 00:11:07.985 04:11:55 bdev_raid -- bdev/bdev_raid.sh@805 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:11:07.985 04:11:55 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:11:07.985 04:11:55 bdev_raid -- 
common/autotest_common.sh@1103 -- # xtrace_disable 00:11:08.243 04:11:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:08.243 ************************************ 00:11:08.243 START TEST raid_superblock_test 00:11:08.243 ************************************ 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid0 3 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid0 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=3 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:11:08.243 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid0 '!=' raid1 ']' 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=3846035 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 3846035 /var/tmp/spdk-raid.sock 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 3846035 ']' 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:08.244 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:08.244 04:11:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.244 [2024-05-15 04:11:56.086976] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:11:08.244 [2024-05-15 04:11:56.087060] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3846035 ] 00:11:08.244 [2024-05-15 04:11:56.169148] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:08.502 [2024-05-15 04:11:56.285248] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:08.502 [2024-05-15 04:11:56.355230] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:08.502 [2024-05-15 04:11:56.355283] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:09.067 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:09.633 malloc1 00:11:09.633 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:09.633 [2024-05-15 04:11:57.615945] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:09.633 [2024-05-15 04:11:57.616021] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:09.633 [2024-05-15 04:11:57.616050] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x252bc20 00:11:09.633 [2024-05-15 04:11:57.616075] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:09.633 [2024-05-15 04:11:57.617801] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:09.633 [2024-05-15 04:11:57.617850] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:09.633 pt1 00:11:09.633 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:11:09.633 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:09.633 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:11:09.633 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:11:09.633 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:09.633 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:09.633 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:11:09.633 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:09.633 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:09.891 malloc2 00:11:09.892 04:11:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:10.149 [2024-05-15 04:11:58.108706] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:10.149 [2024-05-15 04:11:58.108776] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:10.149 [2024-05-15 04:11:58.108801] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2523c00 00:11:10.149 [2024-05-15 04:11:58.108816] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:10.149 [2024-05-15 04:11:58.110528] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:10.149 [2024-05-15 04:11:58.110556] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:10.149 pt2 00:11:10.149 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:11:10.149 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:10.149 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:11:10.149 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:11:10.149 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:11:10.149 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:10.149 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:11:10.149 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:10.149 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:11:10.407 malloc3 00:11:10.665 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:11:10.665 [2024-05-15 04:11:58.653710] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:11:10.665 [2024-05-15 04:11:58.653780] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:10.665 [2024-05-15 04:11:58.653842] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26d49c0 00:11:10.665 [2024-05-15 04:11:58.653860] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:10.665 [2024-05-15 04:11:58.655354] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:10.665 [2024-05-15 04:11:58.655377] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:11:10.665 pt3 00:11:10.666 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:11:10.666 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:11:10.666 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:11:10.924 [2024-05-15 04:11:58.914466] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:10.924 [2024-05-15 04:11:58.915891] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:10.924 [2024-05-15 04:11:58.915955] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:11:10.924 [2024-05-15 04:11:58.916162] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x25278e0 00:11:10.924 [2024-05-15 04:11:58.916180] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:10.924 [2024-05-15 04:11:58.916417] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2526fa0 00:11:10.924 [2024-05-15 04:11:58.916593] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25278e0 00:11:10.924 [2024-05-15 04:11:58.916610] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25278e0 00:11:10.924 [2024-05-15 04:11:58.916745] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:10.924 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:10.924 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:10.924 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:10.924 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:10.924 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:10.924 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:10.924 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:10.924 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:10.924 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:10.924 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:11.182 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:11.182 04:11:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:11.182 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:11.182 "name": "raid_bdev1", 00:11:11.182 "uuid": "0085a034-5f75-4564-97cc-e64299c9203a", 00:11:11.182 "strip_size_kb": 64, 00:11:11.182 "state": "online", 00:11:11.182 "raid_level": "raid0", 00:11:11.182 "superblock": true, 00:11:11.182 "num_base_bdevs": 3, 
00:11:11.182 "num_base_bdevs_discovered": 3, 00:11:11.182 "num_base_bdevs_operational": 3, 00:11:11.182 "base_bdevs_list": [ 00:11:11.182 { 00:11:11.182 "name": "pt1", 00:11:11.182 "uuid": "71f23649-7b3c-5c40-9246-8c60de017bf6", 00:11:11.182 "is_configured": true, 00:11:11.182 "data_offset": 2048, 00:11:11.182 "data_size": 63488 00:11:11.182 }, 00:11:11.182 { 00:11:11.182 "name": "pt2", 00:11:11.182 "uuid": "8d36772d-aed4-5991-b40e-6015db67a40f", 00:11:11.182 "is_configured": true, 00:11:11.182 "data_offset": 2048, 00:11:11.182 "data_size": 63488 00:11:11.182 }, 00:11:11.182 { 00:11:11.182 "name": "pt3", 00:11:11.182 "uuid": "13286314-d3f7-5170-ba0c-1a055774172f", 00:11:11.182 "is_configured": true, 00:11:11.182 "data_offset": 2048, 00:11:11.182 "data_size": 63488 00:11:11.182 } 00:11:11.182 ] 00:11:11.182 }' 00:11:11.182 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:11.182 04:11:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.747 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:11:11.747 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:11:11.747 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:11.747 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:11.747 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:11.747 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:11.747 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:11.747 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:12.005 [2024-05-15 04:11:59.969427] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:12.005 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:12.005 "name": "raid_bdev1", 00:11:12.005 "aliases": [ 00:11:12.005 "0085a034-5f75-4564-97cc-e64299c9203a" 00:11:12.005 ], 00:11:12.005 "product_name": "Raid Volume", 00:11:12.005 "block_size": 512, 00:11:12.005 "num_blocks": 190464, 00:11:12.005 "uuid": "0085a034-5f75-4564-97cc-e64299c9203a", 00:11:12.005 "assigned_rate_limits": { 00:11:12.005 "rw_ios_per_sec": 0, 00:11:12.005 "rw_mbytes_per_sec": 0, 00:11:12.005 "r_mbytes_per_sec": 0, 00:11:12.005 "w_mbytes_per_sec": 0 00:11:12.005 }, 00:11:12.005 "claimed": false, 00:11:12.005 "zoned": false, 00:11:12.005 "supported_io_types": { 00:11:12.005 "read": true, 00:11:12.005 "write": true, 00:11:12.005 "unmap": true, 00:11:12.005 "write_zeroes": true, 00:11:12.005 "flush": true, 00:11:12.005 "reset": true, 00:11:12.005 "compare": false, 00:11:12.005 "compare_and_write": false, 00:11:12.005 "abort": false, 00:11:12.005 "nvme_admin": false, 00:11:12.005 "nvme_io": false 00:11:12.006 }, 00:11:12.006 "memory_domains": [ 00:11:12.006 { 00:11:12.006 "dma_device_id": "system", 00:11:12.006 "dma_device_type": 1 00:11:12.006 }, 00:11:12.006 { 00:11:12.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.006 "dma_device_type": 2 00:11:12.006 }, 00:11:12.006 { 00:11:12.006 "dma_device_id": "system", 00:11:12.006 "dma_device_type": 1 00:11:12.006 }, 00:11:12.006 { 00:11:12.006 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:12.006 "dma_device_type": 2 00:11:12.006 }, 00:11:12.006 { 00:11:12.006 "dma_device_id": "system", 00:11:12.006 "dma_device_type": 1 00:11:12.006 }, 00:11:12.006 { 00:11:12.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.006 "dma_device_type": 2 00:11:12.006 } 00:11:12.006 ], 00:11:12.006 "driver_specific": { 00:11:12.006 "raid": { 00:11:12.006 "uuid": "0085a034-5f75-4564-97cc-e64299c9203a", 00:11:12.006 "strip_size_kb": 64, 00:11:12.006 "state": "online", 00:11:12.006 "raid_level": "raid0", 00:11:12.006 "superblock": true, 00:11:12.006 "num_base_bdevs": 3, 00:11:12.006 "num_base_bdevs_discovered": 3, 00:11:12.006 "num_base_bdevs_operational": 3, 00:11:12.006 "base_bdevs_list": [ 00:11:12.006 { 00:11:12.006 "name": "pt1", 00:11:12.006 "uuid": "71f23649-7b3c-5c40-9246-8c60de017bf6", 00:11:12.006 "is_configured": true, 00:11:12.006 "data_offset": 2048, 00:11:12.006 "data_size": 63488 00:11:12.006 }, 00:11:12.006 { 00:11:12.006 "name": "pt2", 00:11:12.006 "uuid": "8d36772d-aed4-5991-b40e-6015db67a40f", 00:11:12.006 "is_configured": true, 00:11:12.006 "data_offset": 2048, 00:11:12.006 "data_size": 63488 00:11:12.006 }, 00:11:12.006 { 00:11:12.006 "name": "pt3", 00:11:12.006 "uuid": "13286314-d3f7-5170-ba0c-1a055774172f", 00:11:12.006 "is_configured": true, 00:11:12.006 "data_offset": 2048, 00:11:12.006 "data_size": 63488 00:11:12.006 } 00:11:12.006 ] 00:11:12.006 } 00:11:12.006 } 00:11:12.006 }' 00:11:12.006 04:11:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:12.264 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:11:12.264 pt2 00:11:12.264 pt3' 00:11:12.264 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:12.264 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:12.264 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:12.522 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:12.522 "name": "pt1", 00:11:12.522 "aliases": [ 00:11:12.522 "71f23649-7b3c-5c40-9246-8c60de017bf6" 00:11:12.522 ], 00:11:12.522 "product_name": "passthru", 00:11:12.522 "block_size": 512, 00:11:12.522 "num_blocks": 65536, 00:11:12.522 "uuid": "71f23649-7b3c-5c40-9246-8c60de017bf6", 00:11:12.522 "assigned_rate_limits": { 00:11:12.522 "rw_ios_per_sec": 0, 00:11:12.522 "rw_mbytes_per_sec": 0, 00:11:12.522 "r_mbytes_per_sec": 0, 00:11:12.523 "w_mbytes_per_sec": 0 00:11:12.523 }, 00:11:12.523 "claimed": true, 00:11:12.523 "claim_type": "exclusive_write", 00:11:12.523 "zoned": false, 00:11:12.523 "supported_io_types": { 00:11:12.523 "read": true, 00:11:12.523 "write": true, 00:11:12.523 "unmap": true, 00:11:12.523 "write_zeroes": true, 00:11:12.523 "flush": true, 00:11:12.523 "reset": true, 00:11:12.523 "compare": false, 00:11:12.523 "compare_and_write": false, 00:11:12.523 "abort": true, 00:11:12.523 "nvme_admin": false, 00:11:12.523 "nvme_io": false 00:11:12.523 }, 00:11:12.523 "memory_domains": [ 00:11:12.523 { 00:11:12.523 "dma_device_id": "system", 00:11:12.523 "dma_device_type": 1 00:11:12.523 }, 00:11:12.523 { 00:11:12.523 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.523 "dma_device_type": 2 00:11:12.523 } 00:11:12.523 ], 00:11:12.523 "driver_specific": { 
00:11:12.523 "passthru": { 00:11:12.523 "name": "pt1", 00:11:12.523 "base_bdev_name": "malloc1" 00:11:12.523 } 00:11:12.523 } 00:11:12.523 }' 00:11:12.523 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:12.523 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:12.523 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:12.523 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:12.523 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:12.523 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:12.523 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:12.523 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:12.523 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:12.523 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:12.781 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:12.781 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:12.781 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:12.781 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:12.781 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:13.039 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:13.039 "name": "pt2", 00:11:13.039 "aliases": [ 00:11:13.039 "8d36772d-aed4-5991-b40e-6015db67a40f" 00:11:13.039 ], 00:11:13.039 "product_name": "passthru", 00:11:13.039 "block_size": 512, 00:11:13.039 "num_blocks": 65536, 00:11:13.039 "uuid": "8d36772d-aed4-5991-b40e-6015db67a40f", 00:11:13.039 "assigned_rate_limits": { 00:11:13.039 "rw_ios_per_sec": 0, 00:11:13.039 "rw_mbytes_per_sec": 0, 00:11:13.039 "r_mbytes_per_sec": 0, 00:11:13.039 "w_mbytes_per_sec": 0 00:11:13.039 }, 00:11:13.039 "claimed": true, 00:11:13.039 "claim_type": "exclusive_write", 00:11:13.039 "zoned": false, 00:11:13.039 "supported_io_types": { 00:11:13.039 "read": true, 00:11:13.039 "write": true, 00:11:13.039 "unmap": true, 00:11:13.039 "write_zeroes": true, 00:11:13.039 "flush": true, 00:11:13.039 "reset": true, 00:11:13.039 "compare": false, 00:11:13.039 "compare_and_write": false, 00:11:13.039 "abort": true, 00:11:13.039 "nvme_admin": false, 00:11:13.039 "nvme_io": false 00:11:13.039 }, 00:11:13.039 "memory_domains": [ 00:11:13.039 { 00:11:13.039 "dma_device_id": "system", 00:11:13.039 "dma_device_type": 1 00:11:13.039 }, 00:11:13.039 { 00:11:13.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.039 "dma_device_type": 2 00:11:13.039 } 00:11:13.039 ], 00:11:13.039 "driver_specific": { 00:11:13.039 "passthru": { 00:11:13.039 "name": "pt2", 00:11:13.039 "base_bdev_name": "malloc2" 00:11:13.039 } 00:11:13.039 } 00:11:13.039 }' 00:11:13.039 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:13.039 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:13.039 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 
]] 00:11:13.039 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:13.039 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:13.039 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:13.039 04:12:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:13.039 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:13.298 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:13.298 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:13.298 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:13.298 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:13.298 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:13.298 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:11:13.298 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:13.555 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:13.556 "name": "pt3", 00:11:13.556 "aliases": [ 00:11:13.556 "13286314-d3f7-5170-ba0c-1a055774172f" 00:11:13.556 ], 00:11:13.556 "product_name": "passthru", 00:11:13.556 "block_size": 512, 00:11:13.556 "num_blocks": 65536, 00:11:13.556 "uuid": "13286314-d3f7-5170-ba0c-1a055774172f", 00:11:13.556 "assigned_rate_limits": { 00:11:13.556 "rw_ios_per_sec": 0, 00:11:13.556 "rw_mbytes_per_sec": 0, 00:11:13.556 "r_mbytes_per_sec": 0, 00:11:13.556 "w_mbytes_per_sec": 0 00:11:13.556 }, 00:11:13.556 "claimed": true, 00:11:13.556 "claim_type": "exclusive_write", 00:11:13.556 "zoned": false, 00:11:13.556 "supported_io_types": { 00:11:13.556 "read": true, 00:11:13.556 "write": true, 00:11:13.556 "unmap": true, 00:11:13.556 "write_zeroes": true, 00:11:13.556 "flush": true, 00:11:13.556 "reset": true, 00:11:13.556 "compare": false, 00:11:13.556 "compare_and_write": false, 00:11:13.556 "abort": true, 00:11:13.556 "nvme_admin": false, 00:11:13.556 "nvme_io": false 00:11:13.556 }, 00:11:13.556 "memory_domains": [ 00:11:13.556 { 00:11:13.556 "dma_device_id": "system", 00:11:13.556 "dma_device_type": 1 00:11:13.556 }, 00:11:13.556 { 00:11:13.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:13.556 "dma_device_type": 2 00:11:13.556 } 00:11:13.556 ], 00:11:13.556 "driver_specific": { 00:11:13.556 "passthru": { 00:11:13.556 "name": "pt3", 00:11:13.556 "base_bdev_name": "malloc3" 00:11:13.556 } 00:11:13.556 } 00:11:13.556 }' 00:11:13.556 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:13.556 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:13.556 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:13.556 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:13.556 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:13.556 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:13.556 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:13.813 04:12:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:13.813 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:13.813 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:13.813 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:13.813 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:13.813 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:13.813 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:11:14.071 [2024-05-15 04:12:01.898667] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:14.071 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=0085a034-5f75-4564-97cc-e64299c9203a 00:11:14.071 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 0085a034-5f75-4564-97cc-e64299c9203a ']' 00:11:14.071 04:12:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:14.329 [2024-05-15 04:12:02.155077] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:14.329 [2024-05-15 04:12:02.155121] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:14.329 [2024-05-15 04:12:02.155211] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:14.329 [2024-05-15 04:12:02.155272] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:14.329 [2024-05-15 04:12:02.155286] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25278e0 name raid_bdev1, state offline 00:11:14.329 04:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.329 04:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:11:14.587 04:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:11:14.587 04:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:11:14.587 04:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:11:14.587 04:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:14.845 04:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:11:14.845 04:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:15.103 04:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:11:15.103 04:12:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:11:15.361 04:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:15.361 04:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:15.618 04:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:11:15.618 04:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:11:15.618 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:11:15.619 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:11:15.619 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:15.619 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:15.619 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:15.619 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:15.619 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:15.619 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:15.619 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:15.619 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:15.619 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:11:15.876 [2024-05-15 04:12:03.635049] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:15.876 [2024-05-15 04:12:03.636471] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:15.876 [2024-05-15 04:12:03.636532] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:11:15.876 [2024-05-15 04:12:03.636590] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:15.876 [2024-05-15 04:12:03.636659] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:15.876 [2024-05-15 04:12:03.636687] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:11:15.876 [2024-05-15 04:12:03.636719] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:15.876 [2024-05-15 04:12:03.636732] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2525530 name raid_bdev1, state configuring 00:11:15.876 
request: 00:11:15.876 { 00:11:15.876 "name": "raid_bdev1", 00:11:15.876 "raid_level": "raid0", 00:11:15.876 "base_bdevs": [ 00:11:15.876 "malloc1", 00:11:15.876 "malloc2", 00:11:15.876 "malloc3" 00:11:15.876 ], 00:11:15.876 "superblock": false, 00:11:15.876 "strip_size_kb": 64, 00:11:15.876 "method": "bdev_raid_create", 00:11:15.876 "req_id": 1 00:11:15.876 } 00:11:15.876 Got JSON-RPC error response 00:11:15.876 response: 00:11:15.876 { 00:11:15.876 "code": -17, 00:11:15.876 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:15.876 } 00:11:15.876 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:11:15.876 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:15.876 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:11:15.876 04:12:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:15.876 04:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.876 04:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:11:16.133 04:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:11:16.133 04:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:11:16.133 04:12:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:16.133 [2024-05-15 04:12:04.140325] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:16.133 [2024-05-15 04:12:04.140387] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:16.133 [2024-05-15 04:12:04.140413] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26dedd0 00:11:16.133 [2024-05-15 04:12:04.140427] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:16.133 [2024-05-15 04:12:04.141985] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:16.133 [2024-05-15 04:12:04.142010] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:16.133 [2024-05-15 04:12:04.142097] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:16.133 [2024-05-15 04:12:04.142151] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:16.133 pt1 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:16.390 "name": "raid_bdev1", 00:11:16.390 "uuid": "0085a034-5f75-4564-97cc-e64299c9203a", 00:11:16.390 "strip_size_kb": 64, 00:11:16.390 "state": "configuring", 00:11:16.390 "raid_level": "raid0", 00:11:16.390 "superblock": true, 00:11:16.390 "num_base_bdevs": 3, 00:11:16.390 "num_base_bdevs_discovered": 1, 00:11:16.390 "num_base_bdevs_operational": 3, 00:11:16.390 "base_bdevs_list": [ 00:11:16.390 { 00:11:16.390 "name": "pt1", 00:11:16.390 "uuid": "71f23649-7b3c-5c40-9246-8c60de017bf6", 00:11:16.390 "is_configured": true, 00:11:16.390 "data_offset": 2048, 00:11:16.390 "data_size": 63488 00:11:16.390 }, 00:11:16.390 { 00:11:16.390 "name": null, 00:11:16.390 "uuid": "8d36772d-aed4-5991-b40e-6015db67a40f", 00:11:16.390 "is_configured": false, 00:11:16.390 "data_offset": 2048, 00:11:16.390 "data_size": 63488 00:11:16.390 }, 00:11:16.390 { 00:11:16.390 "name": null, 00:11:16.390 "uuid": "13286314-d3f7-5170-ba0c-1a055774172f", 00:11:16.390 "is_configured": false, 00:11:16.390 "data_offset": 2048, 00:11:16.390 "data_size": 63488 00:11:16.390 } 00:11:16.390 ] 00:11:16.390 }' 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:16.390 04:12:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.323 04:12:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 3 -gt 2 ']' 00:11:17.323 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:17.323 [2024-05-15 04:12:05.219180] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:17.323 [2024-05-15 04:12:05.219257] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:17.323 [2024-05-15 04:12:05.219285] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2523e30 00:11:17.323 [2024-05-15 04:12:05.219301] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:17.323 [2024-05-15 04:12:05.219715] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:17.323 [2024-05-15 04:12:05.219743] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:17.323 [2024-05-15 04:12:05.219835] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:17.323 [2024-05-15 04:12:05.219866] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:17.323 pt2 00:11:17.323 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:17.580 [2024-05-15 04:12:05.467914] 
bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.581 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:17.838 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:17.838 "name": "raid_bdev1", 00:11:17.838 "uuid": "0085a034-5f75-4564-97cc-e64299c9203a", 00:11:17.838 "strip_size_kb": 64, 00:11:17.838 "state": "configuring", 00:11:17.838 "raid_level": "raid0", 00:11:17.838 "superblock": true, 00:11:17.838 "num_base_bdevs": 3, 00:11:17.838 "num_base_bdevs_discovered": 1, 00:11:17.838 "num_base_bdevs_operational": 3, 00:11:17.838 "base_bdevs_list": [ 00:11:17.838 { 00:11:17.838 "name": "pt1", 00:11:17.838 "uuid": "71f23649-7b3c-5c40-9246-8c60de017bf6", 00:11:17.838 "is_configured": true, 00:11:17.838 "data_offset": 2048, 00:11:17.838 "data_size": 63488 00:11:17.838 }, 00:11:17.838 { 00:11:17.838 "name": null, 00:11:17.838 "uuid": "8d36772d-aed4-5991-b40e-6015db67a40f", 00:11:17.838 "is_configured": false, 00:11:17.838 "data_offset": 2048, 00:11:17.838 "data_size": 63488 00:11:17.838 }, 00:11:17.838 { 00:11:17.838 "name": null, 00:11:17.838 "uuid": "13286314-d3f7-5170-ba0c-1a055774172f", 00:11:17.838 "is_configured": false, 00:11:17.838 "data_offset": 2048, 00:11:17.838 "data_size": 63488 00:11:17.838 } 00:11:17.838 ] 00:11:17.838 }' 00:11:17.838 04:12:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:17.838 04:12:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:18.422 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:11:18.422 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:11:18.422 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:18.684 [2024-05-15 04:12:06.522684] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:18.684 [2024-05-15 04:12:06.522763] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:11:18.684 [2024-05-15 04:12:06.522797] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x252c5e0 00:11:18.684 [2024-05-15 04:12:06.522814] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:18.684 [2024-05-15 04:12:06.523240] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:18.684 [2024-05-15 04:12:06.523267] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:18.684 [2024-05-15 04:12:06.523352] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:18.684 [2024-05-15 04:12:06.523382] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:18.684 pt2 00:11:18.684 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:11:18.684 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:11:18.684 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:11:18.949 [2024-05-15 04:12:06.803452] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:11:18.949 [2024-05-15 04:12:06.803520] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:18.949 [2024-05-15 04:12:06.803547] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x252adc0 00:11:18.949 [2024-05-15 04:12:06.803563] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:18.949 [2024-05-15 04:12:06.803981] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:18.949 [2024-05-15 04:12:06.804009] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:11:18.949 [2024-05-15 04:12:06.804094] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:11:18.949 [2024-05-15 04:12:06.804124] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:11:18.949 [2024-05-15 04:12:06.804264] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x252cf80 00:11:18.949 [2024-05-15 04:12:06.804280] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:18.949 [2024-05-15 04:12:06.804450] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x252cd50 00:11:18.949 [2024-05-15 04:12:06.804612] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x252cf80 00:11:18.949 [2024-05-15 04:12:06.804628] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x252cf80 00:11:18.949 [2024-05-15 04:12:06.804738] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:18.949 pt3 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:18.949 04:12:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.949 04:12:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:19.206 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:19.206 "name": "raid_bdev1", 00:11:19.206 "uuid": "0085a034-5f75-4564-97cc-e64299c9203a", 00:11:19.206 "strip_size_kb": 64, 00:11:19.206 "state": "online", 00:11:19.206 "raid_level": "raid0", 00:11:19.206 "superblock": true, 00:11:19.206 "num_base_bdevs": 3, 00:11:19.206 "num_base_bdevs_discovered": 3, 00:11:19.206 "num_base_bdevs_operational": 3, 00:11:19.206 "base_bdevs_list": [ 00:11:19.206 { 00:11:19.206 "name": "pt1", 00:11:19.206 "uuid": "71f23649-7b3c-5c40-9246-8c60de017bf6", 00:11:19.206 "is_configured": true, 00:11:19.206 "data_offset": 2048, 00:11:19.206 "data_size": 63488 00:11:19.206 }, 00:11:19.206 { 00:11:19.206 "name": "pt2", 00:11:19.206 "uuid": "8d36772d-aed4-5991-b40e-6015db67a40f", 00:11:19.206 "is_configured": true, 00:11:19.206 "data_offset": 2048, 00:11:19.206 "data_size": 63488 00:11:19.206 }, 00:11:19.206 { 00:11:19.206 "name": "pt3", 00:11:19.206 "uuid": "13286314-d3f7-5170-ba0c-1a055774172f", 00:11:19.206 "is_configured": true, 00:11:19.206 "data_offset": 2048, 00:11:19.206 "data_size": 63488 00:11:19.206 } 00:11:19.206 ] 00:11:19.206 }' 00:11:19.206 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:19.206 04:12:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.769 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:11:19.769 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:11:19.769 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:19.769 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:19.769 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:19.769 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:19.769 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:19.769 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:20.026 [2024-05-15 04:12:07.842533] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:20.026 04:12:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:20.026 "name": "raid_bdev1", 00:11:20.026 "aliases": [ 00:11:20.026 "0085a034-5f75-4564-97cc-e64299c9203a" 00:11:20.026 ], 00:11:20.026 "product_name": "Raid Volume", 00:11:20.026 "block_size": 512, 00:11:20.026 "num_blocks": 190464, 00:11:20.026 "uuid": "0085a034-5f75-4564-97cc-e64299c9203a", 00:11:20.026 "assigned_rate_limits": { 00:11:20.026 "rw_ios_per_sec": 0, 00:11:20.026 "rw_mbytes_per_sec": 0, 00:11:20.026 "r_mbytes_per_sec": 0, 00:11:20.026 "w_mbytes_per_sec": 0 00:11:20.026 }, 00:11:20.026 "claimed": false, 00:11:20.026 "zoned": false, 00:11:20.026 "supported_io_types": { 00:11:20.026 "read": true, 00:11:20.026 "write": true, 00:11:20.026 "unmap": true, 00:11:20.026 "write_zeroes": true, 00:11:20.026 "flush": true, 00:11:20.026 "reset": true, 00:11:20.026 "compare": false, 00:11:20.026 "compare_and_write": false, 00:11:20.026 "abort": false, 00:11:20.026 "nvme_admin": false, 00:11:20.026 "nvme_io": false 00:11:20.026 }, 00:11:20.026 "memory_domains": [ 00:11:20.026 { 00:11:20.026 "dma_device_id": "system", 00:11:20.026 "dma_device_type": 1 00:11:20.026 }, 00:11:20.026 { 00:11:20.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.026 "dma_device_type": 2 00:11:20.026 }, 00:11:20.026 { 00:11:20.026 "dma_device_id": "system", 00:11:20.026 "dma_device_type": 1 00:11:20.026 }, 00:11:20.026 { 00:11:20.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.026 "dma_device_type": 2 00:11:20.026 }, 00:11:20.026 { 00:11:20.026 "dma_device_id": "system", 00:11:20.026 "dma_device_type": 1 00:11:20.027 }, 00:11:20.027 { 00:11:20.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.027 "dma_device_type": 2 00:11:20.027 } 00:11:20.027 ], 00:11:20.027 "driver_specific": { 00:11:20.027 "raid": { 00:11:20.027 "uuid": "0085a034-5f75-4564-97cc-e64299c9203a", 00:11:20.027 "strip_size_kb": 64, 00:11:20.027 "state": "online", 00:11:20.027 "raid_level": "raid0", 00:11:20.027 "superblock": true, 00:11:20.027 "num_base_bdevs": 3, 00:11:20.027 "num_base_bdevs_discovered": 3, 00:11:20.027 "num_base_bdevs_operational": 3, 00:11:20.027 "base_bdevs_list": [ 00:11:20.027 { 00:11:20.027 "name": "pt1", 00:11:20.027 "uuid": "71f23649-7b3c-5c40-9246-8c60de017bf6", 00:11:20.027 "is_configured": true, 00:11:20.027 "data_offset": 2048, 00:11:20.027 "data_size": 63488 00:11:20.027 }, 00:11:20.027 { 00:11:20.027 "name": "pt2", 00:11:20.027 "uuid": "8d36772d-aed4-5991-b40e-6015db67a40f", 00:11:20.027 "is_configured": true, 00:11:20.027 "data_offset": 2048, 00:11:20.027 "data_size": 63488 00:11:20.027 }, 00:11:20.027 { 00:11:20.027 "name": "pt3", 00:11:20.027 "uuid": "13286314-d3f7-5170-ba0c-1a055774172f", 00:11:20.027 "is_configured": true, 00:11:20.027 "data_offset": 2048, 00:11:20.027 "data_size": 63488 00:11:20.027 } 00:11:20.027 ] 00:11:20.027 } 00:11:20.027 } 00:11:20.027 }' 00:11:20.027 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:20.027 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:11:20.027 pt2 00:11:20.027 pt3' 00:11:20.027 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:20.027 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:20.027 04:12:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # 
jq '.[]' 00:11:20.284 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:20.284 "name": "pt1", 00:11:20.284 "aliases": [ 00:11:20.284 "71f23649-7b3c-5c40-9246-8c60de017bf6" 00:11:20.284 ], 00:11:20.284 "product_name": "passthru", 00:11:20.284 "block_size": 512, 00:11:20.284 "num_blocks": 65536, 00:11:20.284 "uuid": "71f23649-7b3c-5c40-9246-8c60de017bf6", 00:11:20.284 "assigned_rate_limits": { 00:11:20.284 "rw_ios_per_sec": 0, 00:11:20.284 "rw_mbytes_per_sec": 0, 00:11:20.284 "r_mbytes_per_sec": 0, 00:11:20.284 "w_mbytes_per_sec": 0 00:11:20.284 }, 00:11:20.284 "claimed": true, 00:11:20.284 "claim_type": "exclusive_write", 00:11:20.284 "zoned": false, 00:11:20.284 "supported_io_types": { 00:11:20.284 "read": true, 00:11:20.284 "write": true, 00:11:20.284 "unmap": true, 00:11:20.284 "write_zeroes": true, 00:11:20.284 "flush": true, 00:11:20.284 "reset": true, 00:11:20.284 "compare": false, 00:11:20.284 "compare_and_write": false, 00:11:20.284 "abort": true, 00:11:20.284 "nvme_admin": false, 00:11:20.284 "nvme_io": false 00:11:20.284 }, 00:11:20.284 "memory_domains": [ 00:11:20.284 { 00:11:20.284 "dma_device_id": "system", 00:11:20.284 "dma_device_type": 1 00:11:20.284 }, 00:11:20.284 { 00:11:20.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.284 "dma_device_type": 2 00:11:20.284 } 00:11:20.284 ], 00:11:20.284 "driver_specific": { 00:11:20.284 "passthru": { 00:11:20.284 "name": "pt1", 00:11:20.284 "base_bdev_name": "malloc1" 00:11:20.284 } 00:11:20.284 } 00:11:20.284 }' 00:11:20.284 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:20.284 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:20.284 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:20.284 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:20.284 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:20.541 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:20.541 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:20.541 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:20.541 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:20.541 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:20.541 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:20.541 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:20.541 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:20.541 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:20.541 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:20.798 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:20.798 "name": "pt2", 00:11:20.798 "aliases": [ 00:11:20.798 "8d36772d-aed4-5991-b40e-6015db67a40f" 00:11:20.798 ], 00:11:20.798 "product_name": "passthru", 00:11:20.798 "block_size": 512, 00:11:20.798 "num_blocks": 65536, 00:11:20.798 "uuid": "8d36772d-aed4-5991-b40e-6015db67a40f", 00:11:20.798 "assigned_rate_limits": 
{ 00:11:20.798 "rw_ios_per_sec": 0, 00:11:20.798 "rw_mbytes_per_sec": 0, 00:11:20.798 "r_mbytes_per_sec": 0, 00:11:20.798 "w_mbytes_per_sec": 0 00:11:20.798 }, 00:11:20.798 "claimed": true, 00:11:20.798 "claim_type": "exclusive_write", 00:11:20.798 "zoned": false, 00:11:20.798 "supported_io_types": { 00:11:20.798 "read": true, 00:11:20.798 "write": true, 00:11:20.798 "unmap": true, 00:11:20.799 "write_zeroes": true, 00:11:20.799 "flush": true, 00:11:20.799 "reset": true, 00:11:20.799 "compare": false, 00:11:20.799 "compare_and_write": false, 00:11:20.799 "abort": true, 00:11:20.799 "nvme_admin": false, 00:11:20.799 "nvme_io": false 00:11:20.799 }, 00:11:20.799 "memory_domains": [ 00:11:20.799 { 00:11:20.799 "dma_device_id": "system", 00:11:20.799 "dma_device_type": 1 00:11:20.799 }, 00:11:20.799 { 00:11:20.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.799 "dma_device_type": 2 00:11:20.799 } 00:11:20.799 ], 00:11:20.799 "driver_specific": { 00:11:20.799 "passthru": { 00:11:20.799 "name": "pt2", 00:11:20.799 "base_bdev_name": "malloc2" 00:11:20.799 } 00:11:20.799 } 00:11:20.799 }' 00:11:20.799 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:20.799 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:20.799 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:20.799 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:21.056 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:21.056 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:21.056 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:21.056 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:21.056 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:21.056 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:21.056 04:12:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:21.056 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:21.056 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:21.056 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:11:21.056 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:21.314 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:21.314 "name": "pt3", 00:11:21.314 "aliases": [ 00:11:21.314 "13286314-d3f7-5170-ba0c-1a055774172f" 00:11:21.314 ], 00:11:21.314 "product_name": "passthru", 00:11:21.314 "block_size": 512, 00:11:21.314 "num_blocks": 65536, 00:11:21.314 "uuid": "13286314-d3f7-5170-ba0c-1a055774172f", 00:11:21.314 "assigned_rate_limits": { 00:11:21.314 "rw_ios_per_sec": 0, 00:11:21.314 "rw_mbytes_per_sec": 0, 00:11:21.314 "r_mbytes_per_sec": 0, 00:11:21.314 "w_mbytes_per_sec": 0 00:11:21.314 }, 00:11:21.314 "claimed": true, 00:11:21.314 "claim_type": "exclusive_write", 00:11:21.314 "zoned": false, 00:11:21.314 "supported_io_types": { 00:11:21.314 "read": true, 00:11:21.314 "write": true, 00:11:21.314 "unmap": true, 00:11:21.314 "write_zeroes": true, 00:11:21.314 
"flush": true, 00:11:21.314 "reset": true, 00:11:21.314 "compare": false, 00:11:21.314 "compare_and_write": false, 00:11:21.314 "abort": true, 00:11:21.314 "nvme_admin": false, 00:11:21.314 "nvme_io": false 00:11:21.314 }, 00:11:21.314 "memory_domains": [ 00:11:21.314 { 00:11:21.314 "dma_device_id": "system", 00:11:21.314 "dma_device_type": 1 00:11:21.314 }, 00:11:21.314 { 00:11:21.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.314 "dma_device_type": 2 00:11:21.314 } 00:11:21.314 ], 00:11:21.314 "driver_specific": { 00:11:21.314 "passthru": { 00:11:21.314 "name": "pt3", 00:11:21.314 "base_bdev_name": "malloc3" 00:11:21.314 } 00:11:21.314 } 00:11:21.314 }' 00:11:21.314 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:21.314 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:21.571 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:11:21.829 [2024-05-15 04:12:09.771817] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 0085a034-5f75-4564-97cc-e64299c9203a '!=' 0085a034-5f75-4564-97cc-e64299c9203a ']' 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid0 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # killprocess 3846035 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 3846035 ']' 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 3846035 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3846035 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:21.829 04:12:09 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3846035' 00:11:21.829 killing process with pid 3846035 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 3846035 00:11:21.829 [2024-05-15 04:12:09.815504] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:21.829 04:12:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 3846035 00:11:21.829 [2024-05-15 04:12:09.815595] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:21.830 [2024-05-15 04:12:09.815666] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:21.830 [2024-05-15 04:12:09.815682] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x252cf80 name raid_bdev1, state offline 00:11:22.087 [2024-05-15 04:12:09.850662] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:22.346 04:12:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@565 -- # return 0 00:11:22.346 00:11:22.346 real 0m14.092s 00:11:22.346 user 0m25.758s 00:11:22.346 sys 0m1.960s 00:11:22.346 04:12:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:22.346 04:12:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:22.346 ************************************ 00:11:22.346 END TEST raid_superblock_test 00:11:22.346 ************************************ 00:11:22.346 04:12:10 bdev_raid -- bdev/bdev_raid.sh@802 -- # for level in raid0 concat raid1 00:11:22.346 04:12:10 bdev_raid -- bdev/bdev_raid.sh@803 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:11:22.346 04:12:10 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:11:22.346 04:12:10 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:22.346 04:12:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:22.346 ************************************ 00:11:22.346 START TEST raid_state_function_test 00:11:22.346 ************************************ 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 3 false 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( 
i++ )) 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=3848665 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3848665' 00:11:22.346 Process raid pid: 3848665 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 3848665 /var/tmp/spdk-raid.sock 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 3848665 ']' 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:22.346 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:22.346 04:12:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:22.346 [2024-05-15 04:12:10.230565] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
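For readers reproducing this step outside the harness: the trace above launches the standalone bdev_svc application on its own RPC socket with the bdev_raid debug log flag, then waits for the socket before driving the test over rpc.py. A minimal sketch of that bring-up is below; the ./spdk prefix and the polling loop are assumptions standing in for the job's absolute workspace path and its waitforlisten helper, and rpc_get_methods is used only as a cheap liveness probe.

    ./spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
    # assumed readiness check: retry until the RPC socket accepts requests
    until ./spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done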
00:11:22.346 [2024-05-15 04:12:10.230645] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:22.346 [2024-05-15 04:12:10.310880] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:22.605 [2024-05-15 04:12:10.428340] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:22.605 [2024-05-15 04:12:10.500247] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:22.605 [2024-05-15 04:12:10.500287] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:23.537 [2024-05-15 04:12:11.406867] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:23.537 [2024-05-15 04:12:11.406911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:23.537 [2024-05-15 04:12:11.406923] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:23.537 [2024-05-15 04:12:11.406934] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:23.537 [2024-05-15 04:12:11.406941] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:23.537 [2024-05-15 04:12:11.406952] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.537 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:23.795 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 
00:11:23.795 "name": "Existed_Raid", 00:11:23.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:23.795 "strip_size_kb": 64, 00:11:23.795 "state": "configuring", 00:11:23.795 "raid_level": "concat", 00:11:23.795 "superblock": false, 00:11:23.795 "num_base_bdevs": 3, 00:11:23.795 "num_base_bdevs_discovered": 0, 00:11:23.795 "num_base_bdevs_operational": 3, 00:11:23.795 "base_bdevs_list": [ 00:11:23.795 { 00:11:23.795 "name": "BaseBdev1", 00:11:23.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:23.795 "is_configured": false, 00:11:23.795 "data_offset": 0, 00:11:23.795 "data_size": 0 00:11:23.795 }, 00:11:23.795 { 00:11:23.795 "name": "BaseBdev2", 00:11:23.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:23.795 "is_configured": false, 00:11:23.795 "data_offset": 0, 00:11:23.795 "data_size": 0 00:11:23.795 }, 00:11:23.795 { 00:11:23.795 "name": "BaseBdev3", 00:11:23.795 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:23.795 "is_configured": false, 00:11:23.795 "data_offset": 0, 00:11:23.795 "data_size": 0 00:11:23.795 } 00:11:23.795 ] 00:11:23.795 }' 00:11:23.795 04:12:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:23.795 04:12:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:24.360 04:12:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:24.619 [2024-05-15 04:12:12.497663] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:24.619 [2024-05-15 04:12:12.497696] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2414020 name Existed_Raid, state configuring 00:11:24.619 04:12:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:24.877 [2024-05-15 04:12:12.750339] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:24.877 [2024-05-15 04:12:12.750380] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:24.877 [2024-05-15 04:12:12.750390] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:24.877 [2024-05-15 04:12:12.750401] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:24.877 [2024-05-15 04:12:12.750409] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:24.877 [2024-05-15 04:12:12.750419] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:24.877 04:12:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:25.135 [2024-05-15 04:12:13.011586] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:25.135 BaseBdev1 00:11:25.135 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:11:25.135 04:12:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:25.135 04:12:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:25.135 04:12:13 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:25.135 04:12:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:25.135 04:12:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:25.135 04:12:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:25.393 04:12:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:25.650 [ 00:11:25.650 { 00:11:25.650 "name": "BaseBdev1", 00:11:25.650 "aliases": [ 00:11:25.650 "f6718ef8-b556-4d17-988c-c8c72aa0a45e" 00:11:25.650 ], 00:11:25.650 "product_name": "Malloc disk", 00:11:25.650 "block_size": 512, 00:11:25.650 "num_blocks": 65536, 00:11:25.650 "uuid": "f6718ef8-b556-4d17-988c-c8c72aa0a45e", 00:11:25.650 "assigned_rate_limits": { 00:11:25.650 "rw_ios_per_sec": 0, 00:11:25.650 "rw_mbytes_per_sec": 0, 00:11:25.650 "r_mbytes_per_sec": 0, 00:11:25.650 "w_mbytes_per_sec": 0 00:11:25.650 }, 00:11:25.650 "claimed": true, 00:11:25.650 "claim_type": "exclusive_write", 00:11:25.650 "zoned": false, 00:11:25.650 "supported_io_types": { 00:11:25.650 "read": true, 00:11:25.650 "write": true, 00:11:25.650 "unmap": true, 00:11:25.650 "write_zeroes": true, 00:11:25.650 "flush": true, 00:11:25.650 "reset": true, 00:11:25.650 "compare": false, 00:11:25.650 "compare_and_write": false, 00:11:25.650 "abort": true, 00:11:25.650 "nvme_admin": false, 00:11:25.650 "nvme_io": false 00:11:25.650 }, 00:11:25.650 "memory_domains": [ 00:11:25.650 { 00:11:25.650 "dma_device_id": "system", 00:11:25.650 "dma_device_type": 1 00:11:25.650 }, 00:11:25.650 { 00:11:25.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:25.650 "dma_device_type": 2 00:11:25.650 } 00:11:25.650 ], 00:11:25.650 "driver_specific": {} 00:11:25.650 } 00:11:25.650 ] 00:11:25.650 04:12:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:25.650 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:25.650 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:25.650 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:25.650 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:25.650 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:25.650 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:25.651 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:25.651 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:25.651 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:25.651 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:25.651 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
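The sequence traced here is the core of raid_state_function_test: bdev_raid_create may name base bdevs that do not exist yet, leaving Existed_Raid in the "configuring" state, and each later bdev_malloc_create is claimed by the waiting raid until all three members are present. A condensed sketch of that RPC flow follows; the rpc wrapper function is an assumption, and the test itself also deletes and recreates the raid between steps, which is omitted here.

    rpc() { ./spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
    rpc bdev_malloc_create 32 512 -b BaseBdev1    # claimed immediately, raid stays "configuring"
    rpc bdev_malloc_create 32 512 -b BaseBdev2    # 2 of 3 base bdevs discovered
    rpc bdev_malloc_create 32 512 -b BaseBdev3    # last member arrives, raid goes "online"
    rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'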
00:11:25.651 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:25.908 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:25.908 "name": "Existed_Raid", 00:11:25.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.908 "strip_size_kb": 64, 00:11:25.908 "state": "configuring", 00:11:25.908 "raid_level": "concat", 00:11:25.908 "superblock": false, 00:11:25.908 "num_base_bdevs": 3, 00:11:25.908 "num_base_bdevs_discovered": 1, 00:11:25.908 "num_base_bdevs_operational": 3, 00:11:25.908 "base_bdevs_list": [ 00:11:25.908 { 00:11:25.908 "name": "BaseBdev1", 00:11:25.908 "uuid": "f6718ef8-b556-4d17-988c-c8c72aa0a45e", 00:11:25.908 "is_configured": true, 00:11:25.908 "data_offset": 0, 00:11:25.908 "data_size": 65536 00:11:25.908 }, 00:11:25.908 { 00:11:25.908 "name": "BaseBdev2", 00:11:25.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.908 "is_configured": false, 00:11:25.908 "data_offset": 0, 00:11:25.908 "data_size": 0 00:11:25.908 }, 00:11:25.908 { 00:11:25.908 "name": "BaseBdev3", 00:11:25.908 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.908 "is_configured": false, 00:11:25.908 "data_offset": 0, 00:11:25.908 "data_size": 0 00:11:25.908 } 00:11:25.908 ] 00:11:25.908 }' 00:11:25.908 04:12:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:25.909 04:12:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:26.474 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:26.731 [2024-05-15 04:12:14.583714] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:26.731 [2024-05-15 04:12:14.583763] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24138f0 name Existed_Raid, state configuring 00:11:26.731 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:26.989 [2024-05-15 04:12:14.824390] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:26.989 [2024-05-15 04:12:14.825910] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:26.989 [2024-05-15 04:12:14.825941] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:26.989 [2024-05-15 04:12:14.825962] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:26.989 [2024-05-15 04:12:14.825973] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:26.989 04:12:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.989 04:12:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:27.247 04:12:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:27.247 "name": "Existed_Raid", 00:11:27.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:27.247 "strip_size_kb": 64, 00:11:27.247 "state": "configuring", 00:11:27.247 "raid_level": "concat", 00:11:27.247 "superblock": false, 00:11:27.247 "num_base_bdevs": 3, 00:11:27.247 "num_base_bdevs_discovered": 1, 00:11:27.247 "num_base_bdevs_operational": 3, 00:11:27.247 "base_bdevs_list": [ 00:11:27.247 { 00:11:27.247 "name": "BaseBdev1", 00:11:27.247 "uuid": "f6718ef8-b556-4d17-988c-c8c72aa0a45e", 00:11:27.247 "is_configured": true, 00:11:27.247 "data_offset": 0, 00:11:27.247 "data_size": 65536 00:11:27.247 }, 00:11:27.247 { 00:11:27.247 "name": "BaseBdev2", 00:11:27.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:27.247 "is_configured": false, 00:11:27.247 "data_offset": 0, 00:11:27.247 "data_size": 0 00:11:27.247 }, 00:11:27.247 { 00:11:27.247 "name": "BaseBdev3", 00:11:27.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:27.247 "is_configured": false, 00:11:27.247 "data_offset": 0, 00:11:27.247 "data_size": 0 00:11:27.247 } 00:11:27.247 ] 00:11:27.247 }' 00:11:27.247 04:12:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:27.247 04:12:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.812 04:12:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:28.070 [2024-05-15 04:12:15.865386] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:28.070 BaseBdev2 00:11:28.070 04:12:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:11:28.070 04:12:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:28.070 04:12:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:28.070 04:12:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:28.070 04:12:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:28.070 04:12:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:28.070 04:12:15 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:28.327 04:12:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:28.585 [ 00:11:28.585 { 00:11:28.585 "name": "BaseBdev2", 00:11:28.585 "aliases": [ 00:11:28.585 "6094a2fc-afc2-4ddc-a6a5-e909f76f8189" 00:11:28.585 ], 00:11:28.585 "product_name": "Malloc disk", 00:11:28.585 "block_size": 512, 00:11:28.585 "num_blocks": 65536, 00:11:28.585 "uuid": "6094a2fc-afc2-4ddc-a6a5-e909f76f8189", 00:11:28.585 "assigned_rate_limits": { 00:11:28.585 "rw_ios_per_sec": 0, 00:11:28.585 "rw_mbytes_per_sec": 0, 00:11:28.585 "r_mbytes_per_sec": 0, 00:11:28.585 "w_mbytes_per_sec": 0 00:11:28.585 }, 00:11:28.585 "claimed": true, 00:11:28.585 "claim_type": "exclusive_write", 00:11:28.585 "zoned": false, 00:11:28.585 "supported_io_types": { 00:11:28.585 "read": true, 00:11:28.585 "write": true, 00:11:28.585 "unmap": true, 00:11:28.585 "write_zeroes": true, 00:11:28.585 "flush": true, 00:11:28.585 "reset": true, 00:11:28.585 "compare": false, 00:11:28.585 "compare_and_write": false, 00:11:28.585 "abort": true, 00:11:28.585 "nvme_admin": false, 00:11:28.585 "nvme_io": false 00:11:28.585 }, 00:11:28.585 "memory_domains": [ 00:11:28.585 { 00:11:28.585 "dma_device_id": "system", 00:11:28.585 "dma_device_type": 1 00:11:28.585 }, 00:11:28.585 { 00:11:28.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.585 "dma_device_type": 2 00:11:28.585 } 00:11:28.585 ], 00:11:28.585 "driver_specific": {} 00:11:28.585 } 00:11:28.585 ] 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:28.585 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:28.586 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:28.586 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:28.586 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name 
== "Existed_Raid")' 00:11:28.843 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:28.843 "name": "Existed_Raid", 00:11:28.843 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.843 "strip_size_kb": 64, 00:11:28.843 "state": "configuring", 00:11:28.843 "raid_level": "concat", 00:11:28.843 "superblock": false, 00:11:28.843 "num_base_bdevs": 3, 00:11:28.844 "num_base_bdevs_discovered": 2, 00:11:28.844 "num_base_bdevs_operational": 3, 00:11:28.844 "base_bdevs_list": [ 00:11:28.844 { 00:11:28.844 "name": "BaseBdev1", 00:11:28.844 "uuid": "f6718ef8-b556-4d17-988c-c8c72aa0a45e", 00:11:28.844 "is_configured": true, 00:11:28.844 "data_offset": 0, 00:11:28.844 "data_size": 65536 00:11:28.844 }, 00:11:28.844 { 00:11:28.844 "name": "BaseBdev2", 00:11:28.844 "uuid": "6094a2fc-afc2-4ddc-a6a5-e909f76f8189", 00:11:28.844 "is_configured": true, 00:11:28.844 "data_offset": 0, 00:11:28.844 "data_size": 65536 00:11:28.844 }, 00:11:28.844 { 00:11:28.844 "name": "BaseBdev3", 00:11:28.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.844 "is_configured": false, 00:11:28.844 "data_offset": 0, 00:11:28.844 "data_size": 0 00:11:28.844 } 00:11:28.844 ] 00:11:28.844 }' 00:11:28.844 04:12:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:28.844 04:12:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:29.410 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:29.410 [2024-05-15 04:12:17.375164] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:29.410 [2024-05-15 04:12:17.375222] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x24147e0 00:11:29.410 [2024-05-15 04:12:17.375233] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:29.410 [2024-05-15 04:12:17.375433] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x242b6d0 00:11:29.410 [2024-05-15 04:12:17.375596] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24147e0 00:11:29.410 [2024-05-15 04:12:17.375613] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24147e0 00:11:29.410 [2024-05-15 04:12:17.375843] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:29.410 BaseBdev3 00:11:29.410 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:11:29.410 04:12:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:11:29.410 04:12:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:29.410 04:12:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:29.410 04:12:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:29.410 04:12:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:29.410 04:12:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:29.667 04:12:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:29.926 [ 00:11:29.926 { 00:11:29.926 "name": "BaseBdev3", 00:11:29.926 "aliases": [ 00:11:29.926 "53ddb64d-a270-4be4-9632-d29b342cf472" 00:11:29.926 ], 00:11:29.926 "product_name": "Malloc disk", 00:11:29.926 "block_size": 512, 00:11:29.926 "num_blocks": 65536, 00:11:29.926 "uuid": "53ddb64d-a270-4be4-9632-d29b342cf472", 00:11:29.926 "assigned_rate_limits": { 00:11:29.926 "rw_ios_per_sec": 0, 00:11:29.926 "rw_mbytes_per_sec": 0, 00:11:29.926 "r_mbytes_per_sec": 0, 00:11:29.926 "w_mbytes_per_sec": 0 00:11:29.926 }, 00:11:29.926 "claimed": true, 00:11:29.926 "claim_type": "exclusive_write", 00:11:29.926 "zoned": false, 00:11:29.926 "supported_io_types": { 00:11:29.926 "read": true, 00:11:29.926 "write": true, 00:11:29.926 "unmap": true, 00:11:29.926 "write_zeroes": true, 00:11:29.926 "flush": true, 00:11:29.926 "reset": true, 00:11:29.926 "compare": false, 00:11:29.926 "compare_and_write": false, 00:11:29.926 "abort": true, 00:11:29.926 "nvme_admin": false, 00:11:29.926 "nvme_io": false 00:11:29.926 }, 00:11:29.926 "memory_domains": [ 00:11:29.926 { 00:11:29.926 "dma_device_id": "system", 00:11:29.926 "dma_device_type": 1 00:11:29.926 }, 00:11:29.926 { 00:11:29.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:29.926 "dma_device_type": 2 00:11:29.926 } 00:11:29.926 ], 00:11:29.926 "driver_specific": {} 00:11:29.926 } 00:11:29.926 ] 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.926 04:12:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:30.184 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:30.184 "name": "Existed_Raid", 00:11:30.184 "uuid": "12f0889d-4a8d-4268-9034-40150b38d3a3", 00:11:30.184 "strip_size_kb": 64, 00:11:30.184 "state": "online", 
00:11:30.184 "raid_level": "concat", 00:11:30.184 "superblock": false, 00:11:30.184 "num_base_bdevs": 3, 00:11:30.184 "num_base_bdevs_discovered": 3, 00:11:30.184 "num_base_bdevs_operational": 3, 00:11:30.185 "base_bdevs_list": [ 00:11:30.185 { 00:11:30.185 "name": "BaseBdev1", 00:11:30.185 "uuid": "f6718ef8-b556-4d17-988c-c8c72aa0a45e", 00:11:30.185 "is_configured": true, 00:11:30.185 "data_offset": 0, 00:11:30.185 "data_size": 65536 00:11:30.185 }, 00:11:30.185 { 00:11:30.185 "name": "BaseBdev2", 00:11:30.185 "uuid": "6094a2fc-afc2-4ddc-a6a5-e909f76f8189", 00:11:30.185 "is_configured": true, 00:11:30.185 "data_offset": 0, 00:11:30.185 "data_size": 65536 00:11:30.185 }, 00:11:30.185 { 00:11:30.185 "name": "BaseBdev3", 00:11:30.185 "uuid": "53ddb64d-a270-4be4-9632-d29b342cf472", 00:11:30.185 "is_configured": true, 00:11:30.185 "data_offset": 0, 00:11:30.185 "data_size": 65536 00:11:30.185 } 00:11:30.185 ] 00:11:30.185 }' 00:11:30.185 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:30.185 04:12:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:30.751 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:11:30.751 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:30.751 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:30.751 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:30.751 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:30.751 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:30.751 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:30.751 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:31.009 [2024-05-15 04:12:18.935551] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:31.009 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:31.009 "name": "Existed_Raid", 00:11:31.009 "aliases": [ 00:11:31.009 "12f0889d-4a8d-4268-9034-40150b38d3a3" 00:11:31.009 ], 00:11:31.009 "product_name": "Raid Volume", 00:11:31.009 "block_size": 512, 00:11:31.009 "num_blocks": 196608, 00:11:31.009 "uuid": "12f0889d-4a8d-4268-9034-40150b38d3a3", 00:11:31.009 "assigned_rate_limits": { 00:11:31.009 "rw_ios_per_sec": 0, 00:11:31.009 "rw_mbytes_per_sec": 0, 00:11:31.009 "r_mbytes_per_sec": 0, 00:11:31.009 "w_mbytes_per_sec": 0 00:11:31.009 }, 00:11:31.009 "claimed": false, 00:11:31.009 "zoned": false, 00:11:31.009 "supported_io_types": { 00:11:31.009 "read": true, 00:11:31.009 "write": true, 00:11:31.009 "unmap": true, 00:11:31.009 "write_zeroes": true, 00:11:31.009 "flush": true, 00:11:31.009 "reset": true, 00:11:31.009 "compare": false, 00:11:31.009 "compare_and_write": false, 00:11:31.009 "abort": false, 00:11:31.009 "nvme_admin": false, 00:11:31.009 "nvme_io": false 00:11:31.009 }, 00:11:31.009 "memory_domains": [ 00:11:31.009 { 00:11:31.009 "dma_device_id": "system", 00:11:31.009 "dma_device_type": 1 00:11:31.009 }, 00:11:31.009 { 00:11:31.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.009 "dma_device_type": 2 00:11:31.009 }, 
00:11:31.009 { 00:11:31.009 "dma_device_id": "system", 00:11:31.009 "dma_device_type": 1 00:11:31.009 }, 00:11:31.009 { 00:11:31.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.009 "dma_device_type": 2 00:11:31.009 }, 00:11:31.009 { 00:11:31.009 "dma_device_id": "system", 00:11:31.009 "dma_device_type": 1 00:11:31.009 }, 00:11:31.009 { 00:11:31.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.009 "dma_device_type": 2 00:11:31.009 } 00:11:31.009 ], 00:11:31.009 "driver_specific": { 00:11:31.009 "raid": { 00:11:31.009 "uuid": "12f0889d-4a8d-4268-9034-40150b38d3a3", 00:11:31.009 "strip_size_kb": 64, 00:11:31.009 "state": "online", 00:11:31.009 "raid_level": "concat", 00:11:31.009 "superblock": false, 00:11:31.009 "num_base_bdevs": 3, 00:11:31.009 "num_base_bdevs_discovered": 3, 00:11:31.009 "num_base_bdevs_operational": 3, 00:11:31.009 "base_bdevs_list": [ 00:11:31.009 { 00:11:31.009 "name": "BaseBdev1", 00:11:31.009 "uuid": "f6718ef8-b556-4d17-988c-c8c72aa0a45e", 00:11:31.009 "is_configured": true, 00:11:31.009 "data_offset": 0, 00:11:31.009 "data_size": 65536 00:11:31.009 }, 00:11:31.009 { 00:11:31.009 "name": "BaseBdev2", 00:11:31.009 "uuid": "6094a2fc-afc2-4ddc-a6a5-e909f76f8189", 00:11:31.009 "is_configured": true, 00:11:31.009 "data_offset": 0, 00:11:31.009 "data_size": 65536 00:11:31.009 }, 00:11:31.009 { 00:11:31.009 "name": "BaseBdev3", 00:11:31.009 "uuid": "53ddb64d-a270-4be4-9632-d29b342cf472", 00:11:31.009 "is_configured": true, 00:11:31.009 "data_offset": 0, 00:11:31.009 "data_size": 65536 00:11:31.009 } 00:11:31.009 ] 00:11:31.009 } 00:11:31.009 } 00:11:31.009 }' 00:11:31.009 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:31.009 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:11:31.009 BaseBdev2 00:11:31.009 BaseBdev3' 00:11:31.009 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:31.009 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:31.009 04:12:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:31.267 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:31.267 "name": "BaseBdev1", 00:11:31.267 "aliases": [ 00:11:31.267 "f6718ef8-b556-4d17-988c-c8c72aa0a45e" 00:11:31.267 ], 00:11:31.267 "product_name": "Malloc disk", 00:11:31.267 "block_size": 512, 00:11:31.267 "num_blocks": 65536, 00:11:31.267 "uuid": "f6718ef8-b556-4d17-988c-c8c72aa0a45e", 00:11:31.267 "assigned_rate_limits": { 00:11:31.267 "rw_ios_per_sec": 0, 00:11:31.267 "rw_mbytes_per_sec": 0, 00:11:31.267 "r_mbytes_per_sec": 0, 00:11:31.267 "w_mbytes_per_sec": 0 00:11:31.267 }, 00:11:31.267 "claimed": true, 00:11:31.267 "claim_type": "exclusive_write", 00:11:31.267 "zoned": false, 00:11:31.267 "supported_io_types": { 00:11:31.267 "read": true, 00:11:31.267 "write": true, 00:11:31.267 "unmap": true, 00:11:31.267 "write_zeroes": true, 00:11:31.267 "flush": true, 00:11:31.267 "reset": true, 00:11:31.267 "compare": false, 00:11:31.267 "compare_and_write": false, 00:11:31.267 "abort": true, 00:11:31.267 "nvme_admin": false, 00:11:31.267 "nvme_io": false 00:11:31.267 }, 00:11:31.267 "memory_domains": [ 00:11:31.267 { 00:11:31.267 "dma_device_id": "system", 
00:11:31.267 "dma_device_type": 1 00:11:31.267 }, 00:11:31.267 { 00:11:31.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.268 "dma_device_type": 2 00:11:31.268 } 00:11:31.268 ], 00:11:31.268 "driver_specific": {} 00:11:31.268 }' 00:11:31.268 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:31.268 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:31.524 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:31.781 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:31.781 "name": "BaseBdev2", 00:11:31.781 "aliases": [ 00:11:31.781 "6094a2fc-afc2-4ddc-a6a5-e909f76f8189" 00:11:31.781 ], 00:11:31.781 "product_name": "Malloc disk", 00:11:31.781 "block_size": 512, 00:11:31.781 "num_blocks": 65536, 00:11:31.781 "uuid": "6094a2fc-afc2-4ddc-a6a5-e909f76f8189", 00:11:31.781 "assigned_rate_limits": { 00:11:31.781 "rw_ios_per_sec": 0, 00:11:31.781 "rw_mbytes_per_sec": 0, 00:11:31.781 "r_mbytes_per_sec": 0, 00:11:31.781 "w_mbytes_per_sec": 0 00:11:31.781 }, 00:11:31.781 "claimed": true, 00:11:31.781 "claim_type": "exclusive_write", 00:11:31.781 "zoned": false, 00:11:31.781 "supported_io_types": { 00:11:31.781 "read": true, 00:11:31.781 "write": true, 00:11:31.781 "unmap": true, 00:11:31.781 "write_zeroes": true, 00:11:31.781 "flush": true, 00:11:31.781 "reset": true, 00:11:31.781 "compare": false, 00:11:31.781 "compare_and_write": false, 00:11:31.781 "abort": true, 00:11:31.781 "nvme_admin": false, 00:11:31.781 "nvme_io": false 00:11:31.781 }, 00:11:31.781 "memory_domains": [ 00:11:31.781 { 00:11:31.781 "dma_device_id": "system", 00:11:31.781 "dma_device_type": 1 00:11:31.781 }, 00:11:31.781 { 00:11:31.781 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:31.781 "dma_device_type": 2 00:11:31.781 } 00:11:31.781 ], 00:11:31.781 "driver_specific": {} 00:11:31.781 }' 00:11:31.781 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:32.039 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:32.039 04:12:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:32.039 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:32.039 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:32.039 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:32.039 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:32.039 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:32.039 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:32.039 04:12:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:32.039 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:32.297 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:32.297 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:32.297 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:32.297 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:32.556 "name": "BaseBdev3", 00:11:32.556 "aliases": [ 00:11:32.556 "53ddb64d-a270-4be4-9632-d29b342cf472" 00:11:32.556 ], 00:11:32.556 "product_name": "Malloc disk", 00:11:32.556 "block_size": 512, 00:11:32.556 "num_blocks": 65536, 00:11:32.556 "uuid": "53ddb64d-a270-4be4-9632-d29b342cf472", 00:11:32.556 "assigned_rate_limits": { 00:11:32.556 "rw_ios_per_sec": 0, 00:11:32.556 "rw_mbytes_per_sec": 0, 00:11:32.556 "r_mbytes_per_sec": 0, 00:11:32.556 "w_mbytes_per_sec": 0 00:11:32.556 }, 00:11:32.556 "claimed": true, 00:11:32.556 "claim_type": "exclusive_write", 00:11:32.556 "zoned": false, 00:11:32.556 "supported_io_types": { 00:11:32.556 "read": true, 00:11:32.556 "write": true, 00:11:32.556 "unmap": true, 00:11:32.556 "write_zeroes": true, 00:11:32.556 "flush": true, 00:11:32.556 "reset": true, 00:11:32.556 "compare": false, 00:11:32.556 "compare_and_write": false, 00:11:32.556 "abort": true, 00:11:32.556 "nvme_admin": false, 00:11:32.556 "nvme_io": false 00:11:32.556 }, 00:11:32.556 "memory_domains": [ 00:11:32.556 { 00:11:32.556 "dma_device_id": "system", 00:11:32.556 "dma_device_type": 1 00:11:32.556 }, 00:11:32.556 { 00:11:32.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.556 "dma_device_type": 2 00:11:32.556 } 00:11:32.556 ], 00:11:32.556 "driver_specific": {} 00:11:32.556 }' 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:32.556 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:32.815 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:32.815 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:32.815 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:33.074 [2024-05-15 04:12:20.840481] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:33.074 [2024-05-15 04:12:20.840511] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:33.074 [2024-05-15 04:12:20.840566] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:33.074 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.075 04:12:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:33.332 04:12:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:33.332 "name": "Existed_Raid", 00:11:33.332 "uuid": "12f0889d-4a8d-4268-9034-40150b38d3a3", 00:11:33.332 "strip_size_kb": 64, 00:11:33.332 "state": "offline", 00:11:33.332 "raid_level": "concat", 00:11:33.332 "superblock": false, 00:11:33.332 "num_base_bdevs": 3, 00:11:33.332 "num_base_bdevs_discovered": 2, 00:11:33.332 
"num_base_bdevs_operational": 2, 00:11:33.332 "base_bdevs_list": [ 00:11:33.332 { 00:11:33.332 "name": null, 00:11:33.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:33.332 "is_configured": false, 00:11:33.332 "data_offset": 0, 00:11:33.332 "data_size": 65536 00:11:33.332 }, 00:11:33.332 { 00:11:33.332 "name": "BaseBdev2", 00:11:33.332 "uuid": "6094a2fc-afc2-4ddc-a6a5-e909f76f8189", 00:11:33.332 "is_configured": true, 00:11:33.332 "data_offset": 0, 00:11:33.332 "data_size": 65536 00:11:33.332 }, 00:11:33.332 { 00:11:33.332 "name": "BaseBdev3", 00:11:33.332 "uuid": "53ddb64d-a270-4be4-9632-d29b342cf472", 00:11:33.332 "is_configured": true, 00:11:33.332 "data_offset": 0, 00:11:33.332 "data_size": 65536 00:11:33.332 } 00:11:33.332 ] 00:11:33.332 }' 00:11:33.332 04:12:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:33.332 04:12:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:33.898 04:12:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:11:33.898 04:12:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:33.898 04:12:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:33.898 04:12:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:11:33.898 04:12:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:11:33.898 04:12:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:33.898 04:12:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:34.156 [2024-05-15 04:12:22.102401] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:34.156 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:11:34.156 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:34.156 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.156 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:11:34.414 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:11:34.414 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:34.414 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:11:34.672 [2024-05-15 04:12:22.603682] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:34.672 [2024-05-15 04:12:22.603741] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24147e0 name Existed_Raid, state offline 00:11:34.672 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:11:34.672 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:11:34.672 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.672 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:11:34.929 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:11:34.929 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:11:34.929 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:11:34.929 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:11:34.929 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:11:34.929 04:12:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:35.189 BaseBdev2 00:11:35.189 04:12:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:11:35.189 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:35.189 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:35.189 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:35.190 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:35.190 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:35.190 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:35.483 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:35.741 [ 00:11:35.741 { 00:11:35.741 "name": "BaseBdev2", 00:11:35.741 "aliases": [ 00:11:35.741 "092d16fd-42e3-42b7-bbbe-f3881c544ce8" 00:11:35.741 ], 00:11:35.741 "product_name": "Malloc disk", 00:11:35.741 "block_size": 512, 00:11:35.741 "num_blocks": 65536, 00:11:35.741 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:35.741 "assigned_rate_limits": { 00:11:35.741 "rw_ios_per_sec": 0, 00:11:35.741 "rw_mbytes_per_sec": 0, 00:11:35.741 "r_mbytes_per_sec": 0, 00:11:35.741 "w_mbytes_per_sec": 0 00:11:35.741 }, 00:11:35.741 "claimed": false, 00:11:35.741 "zoned": false, 00:11:35.741 "supported_io_types": { 00:11:35.741 "read": true, 00:11:35.741 "write": true, 00:11:35.741 "unmap": true, 00:11:35.741 "write_zeroes": true, 00:11:35.741 "flush": true, 00:11:35.741 "reset": true, 00:11:35.741 "compare": false, 00:11:35.741 "compare_and_write": false, 00:11:35.741 "abort": true, 00:11:35.741 "nvme_admin": false, 00:11:35.741 "nvme_io": false 00:11:35.741 }, 00:11:35.741 "memory_domains": [ 00:11:35.741 { 00:11:35.741 "dma_device_id": "system", 00:11:35.741 "dma_device_type": 1 00:11:35.741 }, 00:11:35.741 { 00:11:35.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:35.741 "dma_device_type": 2 00:11:35.741 } 00:11:35.741 ], 00:11:35.741 "driver_specific": {} 00:11:35.741 } 00:11:35.741 ] 00:11:35.741 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:35.741 04:12:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:11:35.741 04:12:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:11:35.741 04:12:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:35.999 BaseBdev3 00:11:35.999 04:12:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:11:35.999 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:11:35.999 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:35.999 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:35.999 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:35.999 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:35.999 04:12:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:36.256 04:12:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:36.514 [ 00:11:36.514 { 00:11:36.514 "name": "BaseBdev3", 00:11:36.514 "aliases": [ 00:11:36.514 "ad109a63-af71-4d79-9416-30068bb2c35d" 00:11:36.514 ], 00:11:36.514 "product_name": "Malloc disk", 00:11:36.514 "block_size": 512, 00:11:36.514 "num_blocks": 65536, 00:11:36.514 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:36.514 "assigned_rate_limits": { 00:11:36.514 "rw_ios_per_sec": 0, 00:11:36.514 "rw_mbytes_per_sec": 0, 00:11:36.514 "r_mbytes_per_sec": 0, 00:11:36.514 "w_mbytes_per_sec": 0 00:11:36.514 }, 00:11:36.514 "claimed": false, 00:11:36.514 "zoned": false, 00:11:36.514 "supported_io_types": { 00:11:36.514 "read": true, 00:11:36.514 "write": true, 00:11:36.514 "unmap": true, 00:11:36.514 "write_zeroes": true, 00:11:36.514 "flush": true, 00:11:36.514 "reset": true, 00:11:36.514 "compare": false, 00:11:36.514 "compare_and_write": false, 00:11:36.514 "abort": true, 00:11:36.514 "nvme_admin": false, 00:11:36.514 "nvme_io": false 00:11:36.514 }, 00:11:36.514 "memory_domains": [ 00:11:36.514 { 00:11:36.515 "dma_device_id": "system", 00:11:36.515 "dma_device_type": 1 00:11:36.515 }, 00:11:36.515 { 00:11:36.515 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:36.515 "dma_device_type": 2 00:11:36.515 } 00:11:36.515 ], 00:11:36.515 "driver_specific": {} 00:11:36.515 } 00:11:36.515 ] 00:11:36.515 04:12:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:36.515 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:11:36.515 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:11:36.515 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:36.773 [2024-05-15 04:12:24.605389] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:36.773 
[2024-05-15 04:12:24.605430] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:36.773 [2024-05-15 04:12:24.605455] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:36.773 [2024-05-15 04:12:24.606766] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:36.773 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:37.031 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:37.031 "name": "Existed_Raid", 00:11:37.031 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.031 "strip_size_kb": 64, 00:11:37.031 "state": "configuring", 00:11:37.031 "raid_level": "concat", 00:11:37.031 "superblock": false, 00:11:37.031 "num_base_bdevs": 3, 00:11:37.031 "num_base_bdevs_discovered": 2, 00:11:37.031 "num_base_bdevs_operational": 3, 00:11:37.031 "base_bdevs_list": [ 00:11:37.032 { 00:11:37.032 "name": "BaseBdev1", 00:11:37.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.032 "is_configured": false, 00:11:37.032 "data_offset": 0, 00:11:37.032 "data_size": 0 00:11:37.032 }, 00:11:37.032 { 00:11:37.032 "name": "BaseBdev2", 00:11:37.032 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:37.032 "is_configured": true, 00:11:37.032 "data_offset": 0, 00:11:37.032 "data_size": 65536 00:11:37.032 }, 00:11:37.032 { 00:11:37.032 "name": "BaseBdev3", 00:11:37.032 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:37.032 "is_configured": true, 00:11:37.032 "data_offset": 0, 00:11:37.032 "data_size": 65536 00:11:37.032 } 00:11:37.032 ] 00:11:37.032 }' 00:11:37.032 04:12:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:37.032 04:12:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:37.597 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:11:37.855 [2024-05-15 04:12:25.632074] 
bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.855 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:38.113 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:38.113 "name": "Existed_Raid", 00:11:38.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:38.113 "strip_size_kb": 64, 00:11:38.113 "state": "configuring", 00:11:38.113 "raid_level": "concat", 00:11:38.113 "superblock": false, 00:11:38.113 "num_base_bdevs": 3, 00:11:38.113 "num_base_bdevs_discovered": 1, 00:11:38.113 "num_base_bdevs_operational": 3, 00:11:38.113 "base_bdevs_list": [ 00:11:38.113 { 00:11:38.113 "name": "BaseBdev1", 00:11:38.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:38.113 "is_configured": false, 00:11:38.113 "data_offset": 0, 00:11:38.113 "data_size": 0 00:11:38.113 }, 00:11:38.113 { 00:11:38.113 "name": null, 00:11:38.113 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:38.113 "is_configured": false, 00:11:38.113 "data_offset": 0, 00:11:38.113 "data_size": 65536 00:11:38.113 }, 00:11:38.113 { 00:11:38.113 "name": "BaseBdev3", 00:11:38.113 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:38.113 "is_configured": true, 00:11:38.113 "data_offset": 0, 00:11:38.113 "data_size": 65536 00:11:38.113 } 00:11:38.113 ] 00:11:38.113 }' 00:11:38.113 04:12:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:38.113 04:12:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.679 04:12:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:38.679 04:12:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:38.679 04:12:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:11:38.679 04:12:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:39.245 [2024-05-15 04:12:26.968895] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:39.245 BaseBdev1 00:11:39.245 04:12:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:11:39.245 04:12:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:39.245 04:12:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:39.245 04:12:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:39.245 04:12:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:39.245 04:12:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:39.245 04:12:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:39.245 04:12:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:39.504 [ 00:11:39.504 { 00:11:39.504 "name": "BaseBdev1", 00:11:39.504 "aliases": [ 00:11:39.504 "6b50c03a-4ad6-436b-808d-df3b715dba8e" 00:11:39.504 ], 00:11:39.504 "product_name": "Malloc disk", 00:11:39.504 "block_size": 512, 00:11:39.504 "num_blocks": 65536, 00:11:39.504 "uuid": "6b50c03a-4ad6-436b-808d-df3b715dba8e", 00:11:39.504 "assigned_rate_limits": { 00:11:39.504 "rw_ios_per_sec": 0, 00:11:39.504 "rw_mbytes_per_sec": 0, 00:11:39.504 "r_mbytes_per_sec": 0, 00:11:39.504 "w_mbytes_per_sec": 0 00:11:39.504 }, 00:11:39.504 "claimed": true, 00:11:39.504 "claim_type": "exclusive_write", 00:11:39.504 "zoned": false, 00:11:39.504 "supported_io_types": { 00:11:39.504 "read": true, 00:11:39.504 "write": true, 00:11:39.504 "unmap": true, 00:11:39.504 "write_zeroes": true, 00:11:39.504 "flush": true, 00:11:39.504 "reset": true, 00:11:39.504 "compare": false, 00:11:39.504 "compare_and_write": false, 00:11:39.504 "abort": true, 00:11:39.504 "nvme_admin": false, 00:11:39.504 "nvme_io": false 00:11:39.504 }, 00:11:39.504 "memory_domains": [ 00:11:39.504 { 00:11:39.504 "dma_device_id": "system", 00:11:39.504 "dma_device_type": 1 00:11:39.504 }, 00:11:39.504 { 00:11:39.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.504 "dma_device_type": 2 00:11:39.504 } 00:11:39.504 ], 00:11:39.504 "driver_specific": {} 00:11:39.504 } 00:11:39.504 ] 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
num_base_bdevs_operational=3 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.504 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:39.762 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:39.762 "name": "Existed_Raid", 00:11:39.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.762 "strip_size_kb": 64, 00:11:39.762 "state": "configuring", 00:11:39.762 "raid_level": "concat", 00:11:39.762 "superblock": false, 00:11:39.762 "num_base_bdevs": 3, 00:11:39.762 "num_base_bdevs_discovered": 2, 00:11:39.762 "num_base_bdevs_operational": 3, 00:11:39.762 "base_bdevs_list": [ 00:11:39.762 { 00:11:39.762 "name": "BaseBdev1", 00:11:39.762 "uuid": "6b50c03a-4ad6-436b-808d-df3b715dba8e", 00:11:39.762 "is_configured": true, 00:11:39.762 "data_offset": 0, 00:11:39.762 "data_size": 65536 00:11:39.762 }, 00:11:39.762 { 00:11:39.762 "name": null, 00:11:39.762 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:39.762 "is_configured": false, 00:11:39.762 "data_offset": 0, 00:11:39.762 "data_size": 65536 00:11:39.762 }, 00:11:39.762 { 00:11:39.762 "name": "BaseBdev3", 00:11:39.762 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:39.762 "is_configured": true, 00:11:39.762 "data_offset": 0, 00:11:39.762 "data_size": 65536 00:11:39.762 } 00:11:39.762 ] 00:11:39.762 }' 00:11:39.762 04:12:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:39.762 04:12:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.329 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.329 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:40.586 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:11:40.586 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:11:40.844 [2024-05-15 04:12:28.745547] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
strip_size=64 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.844 04:12:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:41.103 04:12:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:41.103 "name": "Existed_Raid", 00:11:41.103 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:41.103 "strip_size_kb": 64, 00:11:41.103 "state": "configuring", 00:11:41.103 "raid_level": "concat", 00:11:41.103 "superblock": false, 00:11:41.103 "num_base_bdevs": 3, 00:11:41.103 "num_base_bdevs_discovered": 1, 00:11:41.103 "num_base_bdevs_operational": 3, 00:11:41.103 "base_bdevs_list": [ 00:11:41.103 { 00:11:41.103 "name": "BaseBdev1", 00:11:41.103 "uuid": "6b50c03a-4ad6-436b-808d-df3b715dba8e", 00:11:41.103 "is_configured": true, 00:11:41.103 "data_offset": 0, 00:11:41.103 "data_size": 65536 00:11:41.103 }, 00:11:41.103 { 00:11:41.103 "name": null, 00:11:41.103 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:41.103 "is_configured": false, 00:11:41.103 "data_offset": 0, 00:11:41.103 "data_size": 65536 00:11:41.103 }, 00:11:41.103 { 00:11:41.103 "name": null, 00:11:41.103 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:41.103 "is_configured": false, 00:11:41.103 "data_offset": 0, 00:11:41.103 "data_size": 65536 00:11:41.103 } 00:11:41.103 ] 00:11:41.103 }' 00:11:41.103 04:12:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:41.103 04:12:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:41.668 04:12:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.668 04:12:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:41.926 04:12:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:11:41.926 04:12:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:11:42.184 [2024-05-15 04:12:30.012992] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:42.184 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:42.184 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:42.185 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:42.185 04:12:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:42.185 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:42.185 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:42.185 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:42.185 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:42.185 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:42.185 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:42.185 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.185 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:42.442 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:42.442 "name": "Existed_Raid", 00:11:42.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:42.442 "strip_size_kb": 64, 00:11:42.442 "state": "configuring", 00:11:42.442 "raid_level": "concat", 00:11:42.442 "superblock": false, 00:11:42.442 "num_base_bdevs": 3, 00:11:42.442 "num_base_bdevs_discovered": 2, 00:11:42.442 "num_base_bdevs_operational": 3, 00:11:42.442 "base_bdevs_list": [ 00:11:42.442 { 00:11:42.442 "name": "BaseBdev1", 00:11:42.442 "uuid": "6b50c03a-4ad6-436b-808d-df3b715dba8e", 00:11:42.442 "is_configured": true, 00:11:42.442 "data_offset": 0, 00:11:42.442 "data_size": 65536 00:11:42.442 }, 00:11:42.442 { 00:11:42.442 "name": null, 00:11:42.442 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:42.442 "is_configured": false, 00:11:42.442 "data_offset": 0, 00:11:42.442 "data_size": 65536 00:11:42.442 }, 00:11:42.442 { 00:11:42.442 "name": "BaseBdev3", 00:11:42.442 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:42.442 "is_configured": true, 00:11:42.442 "data_offset": 0, 00:11:42.442 "data_size": 65536 00:11:42.442 } 00:11:42.442 ] 00:11:42.442 }' 00:11:42.442 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:42.442 04:12:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.008 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.008 04:12:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:11:43.266 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:11:43.266 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:43.524 [2024-05-15 04:12:31.340522] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.524 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:43.781 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:43.781 "name": "Existed_Raid", 00:11:43.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:43.781 "strip_size_kb": 64, 00:11:43.781 "state": "configuring", 00:11:43.781 "raid_level": "concat", 00:11:43.781 "superblock": false, 00:11:43.781 "num_base_bdevs": 3, 00:11:43.781 "num_base_bdevs_discovered": 1, 00:11:43.781 "num_base_bdevs_operational": 3, 00:11:43.781 "base_bdevs_list": [ 00:11:43.781 { 00:11:43.781 "name": null, 00:11:43.781 "uuid": "6b50c03a-4ad6-436b-808d-df3b715dba8e", 00:11:43.781 "is_configured": false, 00:11:43.781 "data_offset": 0, 00:11:43.781 "data_size": 65536 00:11:43.781 }, 00:11:43.781 { 00:11:43.781 "name": null, 00:11:43.781 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:43.781 "is_configured": false, 00:11:43.781 "data_offset": 0, 00:11:43.781 "data_size": 65536 00:11:43.781 }, 00:11:43.781 { 00:11:43.781 "name": "BaseBdev3", 00:11:43.781 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:43.781 "is_configured": true, 00:11:43.782 "data_offset": 0, 00:11:43.782 "data_size": 65536 00:11:43.782 } 00:11:43.782 ] 00:11:43.782 }' 00:11:43.782 04:12:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:43.782 04:12:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:44.346 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.346 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:11:44.604 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:11:44.604 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:11:44.604 [2024-05-15 04:12:32.619662] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:44.863 04:12:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:44.863 "name": "Existed_Raid", 00:11:44.863 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:44.863 "strip_size_kb": 64, 00:11:44.863 "state": "configuring", 00:11:44.863 "raid_level": "concat", 00:11:44.863 "superblock": false, 00:11:44.863 "num_base_bdevs": 3, 00:11:44.863 "num_base_bdevs_discovered": 2, 00:11:44.863 "num_base_bdevs_operational": 3, 00:11:44.863 "base_bdevs_list": [ 00:11:44.863 { 00:11:44.863 "name": null, 00:11:44.863 "uuid": "6b50c03a-4ad6-436b-808d-df3b715dba8e", 00:11:44.863 "is_configured": false, 00:11:44.863 "data_offset": 0, 00:11:44.863 "data_size": 65536 00:11:44.863 }, 00:11:44.863 { 00:11:44.863 "name": "BaseBdev2", 00:11:44.863 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:44.863 "is_configured": true, 00:11:44.863 "data_offset": 0, 00:11:44.863 "data_size": 65536 00:11:44.863 }, 00:11:44.863 { 00:11:44.863 "name": "BaseBdev3", 00:11:44.863 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:44.863 "is_configured": true, 00:11:44.863 "data_offset": 0, 00:11:44.863 "data_size": 65536 00:11:44.863 } 00:11:44.863 ] 00:11:44.863 }' 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:44.863 04:12:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:45.429 04:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.429 04:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:11:45.685 04:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:11:45.685 04:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:45.685 04:12:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:11:45.942 04:12:33 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6b50c03a-4ad6-436b-808d-df3b715dba8e 00:11:46.200 [2024-05-15 04:12:34.161354] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:11:46.200 [2024-05-15 04:12:34.161407] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x25b81f0 00:11:46.200 [2024-05-15 04:12:34.161416] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:11:46.200 [2024-05-15 04:12:34.161586] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25bc400 00:11:46.200 [2024-05-15 04:12:34.161707] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25b81f0 00:11:46.200 [2024-05-15 04:12:34.161720] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25b81f0 00:11:46.200 [2024-05-15 04:12:34.161933] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:46.200 NewBaseBdev 00:11:46.200 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:11:46.200 04:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:11:46.200 04:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:46.200 04:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:11:46.200 04:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:46.200 04:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:46.200 04:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:46.458 04:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:11:46.717 [ 00:11:46.717 { 00:11:46.717 "name": "NewBaseBdev", 00:11:46.717 "aliases": [ 00:11:46.717 "6b50c03a-4ad6-436b-808d-df3b715dba8e" 00:11:46.717 ], 00:11:46.717 "product_name": "Malloc disk", 00:11:46.717 "block_size": 512, 00:11:46.717 "num_blocks": 65536, 00:11:46.717 "uuid": "6b50c03a-4ad6-436b-808d-df3b715dba8e", 00:11:46.717 "assigned_rate_limits": { 00:11:46.717 "rw_ios_per_sec": 0, 00:11:46.717 "rw_mbytes_per_sec": 0, 00:11:46.717 "r_mbytes_per_sec": 0, 00:11:46.717 "w_mbytes_per_sec": 0 00:11:46.717 }, 00:11:46.717 "claimed": true, 00:11:46.717 "claim_type": "exclusive_write", 00:11:46.717 "zoned": false, 00:11:46.717 "supported_io_types": { 00:11:46.717 "read": true, 00:11:46.717 "write": true, 00:11:46.717 "unmap": true, 00:11:46.717 "write_zeroes": true, 00:11:46.717 "flush": true, 00:11:46.717 "reset": true, 00:11:46.717 "compare": false, 00:11:46.717 "compare_and_write": false, 00:11:46.717 "abort": true, 00:11:46.717 "nvme_admin": false, 00:11:46.717 "nvme_io": false 00:11:46.717 }, 00:11:46.717 "memory_domains": [ 00:11:46.717 { 00:11:46.717 "dma_device_id": "system", 00:11:46.717 "dma_device_type": 1 00:11:46.717 }, 00:11:46.717 { 00:11:46.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:46.717 "dma_device_type": 2 00:11:46.717 } 00:11:46.717 ], 00:11:46.717 "driver_specific": {} 
00:11:46.717 } 00:11:46.717 ] 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.717 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:46.976 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:46.976 "name": "Existed_Raid", 00:11:46.976 "uuid": "501aeb27-2179-46fd-94b4-a782b1b87a91", 00:11:46.976 "strip_size_kb": 64, 00:11:46.976 "state": "online", 00:11:46.976 "raid_level": "concat", 00:11:46.976 "superblock": false, 00:11:46.976 "num_base_bdevs": 3, 00:11:46.976 "num_base_bdevs_discovered": 3, 00:11:46.976 "num_base_bdevs_operational": 3, 00:11:46.976 "base_bdevs_list": [ 00:11:46.976 { 00:11:46.976 "name": "NewBaseBdev", 00:11:46.976 "uuid": "6b50c03a-4ad6-436b-808d-df3b715dba8e", 00:11:46.976 "is_configured": true, 00:11:46.976 "data_offset": 0, 00:11:46.976 "data_size": 65536 00:11:46.976 }, 00:11:46.976 { 00:11:46.976 "name": "BaseBdev2", 00:11:46.976 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:46.976 "is_configured": true, 00:11:46.976 "data_offset": 0, 00:11:46.976 "data_size": 65536 00:11:46.976 }, 00:11:46.976 { 00:11:46.976 "name": "BaseBdev3", 00:11:46.976 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:46.976 "is_configured": true, 00:11:46.976 "data_offset": 0, 00:11:46.976 "data_size": 65536 00:11:46.976 } 00:11:46.976 ] 00:11:46.976 }' 00:11:46.976 04:12:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:46.976 04:12:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:47.541 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:11:47.542 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:47.542 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:47.542 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:47.542 04:12:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:47.542 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:11:47.542 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:47.542 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:47.800 [2024-05-15 04:12:35.669587] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:47.800 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:47.800 "name": "Existed_Raid", 00:11:47.800 "aliases": [ 00:11:47.800 "501aeb27-2179-46fd-94b4-a782b1b87a91" 00:11:47.800 ], 00:11:47.800 "product_name": "Raid Volume", 00:11:47.800 "block_size": 512, 00:11:47.800 "num_blocks": 196608, 00:11:47.800 "uuid": "501aeb27-2179-46fd-94b4-a782b1b87a91", 00:11:47.800 "assigned_rate_limits": { 00:11:47.800 "rw_ios_per_sec": 0, 00:11:47.800 "rw_mbytes_per_sec": 0, 00:11:47.800 "r_mbytes_per_sec": 0, 00:11:47.800 "w_mbytes_per_sec": 0 00:11:47.800 }, 00:11:47.800 "claimed": false, 00:11:47.800 "zoned": false, 00:11:47.800 "supported_io_types": { 00:11:47.800 "read": true, 00:11:47.800 "write": true, 00:11:47.800 "unmap": true, 00:11:47.800 "write_zeroes": true, 00:11:47.800 "flush": true, 00:11:47.800 "reset": true, 00:11:47.800 "compare": false, 00:11:47.800 "compare_and_write": false, 00:11:47.800 "abort": false, 00:11:47.800 "nvme_admin": false, 00:11:47.800 "nvme_io": false 00:11:47.800 }, 00:11:47.800 "memory_domains": [ 00:11:47.800 { 00:11:47.800 "dma_device_id": "system", 00:11:47.800 "dma_device_type": 1 00:11:47.800 }, 00:11:47.800 { 00:11:47.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.800 "dma_device_type": 2 00:11:47.800 }, 00:11:47.800 { 00:11:47.800 "dma_device_id": "system", 00:11:47.800 "dma_device_type": 1 00:11:47.800 }, 00:11:47.800 { 00:11:47.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.800 "dma_device_type": 2 00:11:47.800 }, 00:11:47.800 { 00:11:47.800 "dma_device_id": "system", 00:11:47.800 "dma_device_type": 1 00:11:47.800 }, 00:11:47.800 { 00:11:47.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.800 "dma_device_type": 2 00:11:47.800 } 00:11:47.800 ], 00:11:47.800 "driver_specific": { 00:11:47.800 "raid": { 00:11:47.800 "uuid": "501aeb27-2179-46fd-94b4-a782b1b87a91", 00:11:47.800 "strip_size_kb": 64, 00:11:47.800 "state": "online", 00:11:47.800 "raid_level": "concat", 00:11:47.800 "superblock": false, 00:11:47.800 "num_base_bdevs": 3, 00:11:47.800 "num_base_bdevs_discovered": 3, 00:11:47.800 "num_base_bdevs_operational": 3, 00:11:47.800 "base_bdevs_list": [ 00:11:47.800 { 00:11:47.800 "name": "NewBaseBdev", 00:11:47.800 "uuid": "6b50c03a-4ad6-436b-808d-df3b715dba8e", 00:11:47.800 "is_configured": true, 00:11:47.800 "data_offset": 0, 00:11:47.800 "data_size": 65536 00:11:47.800 }, 00:11:47.800 { 00:11:47.800 "name": "BaseBdev2", 00:11:47.800 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:47.800 "is_configured": true, 00:11:47.800 "data_offset": 0, 00:11:47.800 "data_size": 65536 00:11:47.800 }, 00:11:47.800 { 00:11:47.800 "name": "BaseBdev3", 00:11:47.800 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:47.800 "is_configured": true, 00:11:47.800 "data_offset": 0, 00:11:47.800 "data_size": 65536 00:11:47.800 } 00:11:47.800 ] 00:11:47.800 } 00:11:47.800 } 00:11:47.800 }' 00:11:47.800 04:12:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:47.800 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:11:47.800 BaseBdev2 00:11:47.800 BaseBdev3' 00:11:47.801 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:47.801 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:11:47.801 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:48.059 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:48.059 "name": "NewBaseBdev", 00:11:48.059 "aliases": [ 00:11:48.059 "6b50c03a-4ad6-436b-808d-df3b715dba8e" 00:11:48.059 ], 00:11:48.059 "product_name": "Malloc disk", 00:11:48.059 "block_size": 512, 00:11:48.059 "num_blocks": 65536, 00:11:48.059 "uuid": "6b50c03a-4ad6-436b-808d-df3b715dba8e", 00:11:48.059 "assigned_rate_limits": { 00:11:48.059 "rw_ios_per_sec": 0, 00:11:48.059 "rw_mbytes_per_sec": 0, 00:11:48.059 "r_mbytes_per_sec": 0, 00:11:48.059 "w_mbytes_per_sec": 0 00:11:48.059 }, 00:11:48.059 "claimed": true, 00:11:48.059 "claim_type": "exclusive_write", 00:11:48.059 "zoned": false, 00:11:48.059 "supported_io_types": { 00:11:48.059 "read": true, 00:11:48.059 "write": true, 00:11:48.059 "unmap": true, 00:11:48.059 "write_zeroes": true, 00:11:48.059 "flush": true, 00:11:48.059 "reset": true, 00:11:48.059 "compare": false, 00:11:48.059 "compare_and_write": false, 00:11:48.059 "abort": true, 00:11:48.059 "nvme_admin": false, 00:11:48.059 "nvme_io": false 00:11:48.059 }, 00:11:48.059 "memory_domains": [ 00:11:48.059 { 00:11:48.059 "dma_device_id": "system", 00:11:48.059 "dma_device_type": 1 00:11:48.059 }, 00:11:48.059 { 00:11:48.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.059 "dma_device_type": 2 00:11:48.059 } 00:11:48.059 ], 00:11:48.059 "driver_specific": {} 00:11:48.059 }' 00:11:48.059 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:48.059 04:12:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:48.059 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:48.059 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:48.059 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:48.317 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:48.317 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:48.317 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:48.317 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:48.317 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:48.317 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:48.317 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:48.317 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:48.317 04:12:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:48.317 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:48.575 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:48.575 "name": "BaseBdev2", 00:11:48.575 "aliases": [ 00:11:48.575 "092d16fd-42e3-42b7-bbbe-f3881c544ce8" 00:11:48.575 ], 00:11:48.575 "product_name": "Malloc disk", 00:11:48.575 "block_size": 512, 00:11:48.575 "num_blocks": 65536, 00:11:48.575 "uuid": "092d16fd-42e3-42b7-bbbe-f3881c544ce8", 00:11:48.575 "assigned_rate_limits": { 00:11:48.575 "rw_ios_per_sec": 0, 00:11:48.575 "rw_mbytes_per_sec": 0, 00:11:48.575 "r_mbytes_per_sec": 0, 00:11:48.575 "w_mbytes_per_sec": 0 00:11:48.575 }, 00:11:48.575 "claimed": true, 00:11:48.575 "claim_type": "exclusive_write", 00:11:48.575 "zoned": false, 00:11:48.575 "supported_io_types": { 00:11:48.575 "read": true, 00:11:48.575 "write": true, 00:11:48.575 "unmap": true, 00:11:48.575 "write_zeroes": true, 00:11:48.575 "flush": true, 00:11:48.575 "reset": true, 00:11:48.575 "compare": false, 00:11:48.575 "compare_and_write": false, 00:11:48.575 "abort": true, 00:11:48.575 "nvme_admin": false, 00:11:48.575 "nvme_io": false 00:11:48.575 }, 00:11:48.575 "memory_domains": [ 00:11:48.575 { 00:11:48.575 "dma_device_id": "system", 00:11:48.575 "dma_device_type": 1 00:11:48.575 }, 00:11:48.575 { 00:11:48.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:48.575 "dma_device_type": 2 00:11:48.575 } 00:11:48.575 ], 00:11:48.575 "driver_specific": {} 00:11:48.575 }' 00:11:48.575 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:48.575 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:48.575 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:48.575 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:48.575 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:48.833 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:48.833 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:48.833 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:48.833 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:48.833 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:48.833 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:48.833 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:48.833 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:48.833 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:48.833 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:49.091 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:49.091 "name": "BaseBdev3", 00:11:49.091 "aliases": [ 00:11:49.091 
"ad109a63-af71-4d79-9416-30068bb2c35d" 00:11:49.091 ], 00:11:49.091 "product_name": "Malloc disk", 00:11:49.091 "block_size": 512, 00:11:49.091 "num_blocks": 65536, 00:11:49.091 "uuid": "ad109a63-af71-4d79-9416-30068bb2c35d", 00:11:49.091 "assigned_rate_limits": { 00:11:49.091 "rw_ios_per_sec": 0, 00:11:49.091 "rw_mbytes_per_sec": 0, 00:11:49.091 "r_mbytes_per_sec": 0, 00:11:49.091 "w_mbytes_per_sec": 0 00:11:49.091 }, 00:11:49.091 "claimed": true, 00:11:49.091 "claim_type": "exclusive_write", 00:11:49.091 "zoned": false, 00:11:49.091 "supported_io_types": { 00:11:49.091 "read": true, 00:11:49.091 "write": true, 00:11:49.091 "unmap": true, 00:11:49.091 "write_zeroes": true, 00:11:49.091 "flush": true, 00:11:49.091 "reset": true, 00:11:49.091 "compare": false, 00:11:49.091 "compare_and_write": false, 00:11:49.091 "abort": true, 00:11:49.091 "nvme_admin": false, 00:11:49.091 "nvme_io": false 00:11:49.091 }, 00:11:49.091 "memory_domains": [ 00:11:49.091 { 00:11:49.091 "dma_device_id": "system", 00:11:49.091 "dma_device_type": 1 00:11:49.091 }, 00:11:49.091 { 00:11:49.091 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:49.091 "dma_device_type": 2 00:11:49.091 } 00:11:49.091 ], 00:11:49.091 "driver_specific": {} 00:11:49.091 }' 00:11:49.091 04:12:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:49.091 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:49.091 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:49.091 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:49.091 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:49.349 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:49.349 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:49.350 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:49.350 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:49.350 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:49.350 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:49.350 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:49.350 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:49.608 [2024-05-15 04:12:37.478418] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:49.608 [2024-05-15 04:12:37.478447] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:49.608 [2024-05-15 04:12:37.478529] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:49.608 [2024-05-15 04:12:37.478591] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:49.608 [2024-05-15 04:12:37.478605] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25b81f0 name Existed_Raid, state offline 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 3848665 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 
-- # '[' -z 3848665 ']' 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 3848665 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3848665 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3848665' 00:11:49.608 killing process with pid 3848665 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 3848665 00:11:49.608 [2024-05-15 04:12:37.517491] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:49.608 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 3848665 00:11:49.608 [2024-05-15 04:12:37.552095] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:11:49.865 00:11:49.865 real 0m27.629s 00:11:49.865 user 0m51.544s 00:11:49.865 sys 0m3.784s 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:49.865 ************************************ 00:11:49.865 END TEST raid_state_function_test 00:11:49.865 ************************************ 00:11:49.865 04:12:37 bdev_raid -- bdev/bdev_raid.sh@804 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:11:49.865 04:12:37 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:11:49.865 04:12:37 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:11:49.865 04:12:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:49.865 ************************************ 00:11:49.865 START TEST raid_state_function_test_sb 00:11:49.865 ************************************ 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 3 true 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:49.865 
04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:11:49.865 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=3852469 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3852469' 00:11:49.866 Process raid pid: 3852469 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 3852469 /var/tmp/spdk-raid.sock 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3852469 ']' 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:49.866 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:11:49.866 04:12:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:50.123 [2024-05-15 04:12:37.916976] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
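[Editorial aside] Everything from here on runs against a fresh bdev_svc instance: the helper app is launched with its RPC server on a private socket and the bdev_raid debug log component enabled, and the test blocks until that socket is listening before it issues any RPCs. A minimal sketch of that startup using the paths and flags from this run (backgrounding with & and capturing $! to obtain raid_pid is an assumption about how the surrounding script does it; waitforlisten is the helper from autotest_common.sh):

  /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
      -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid &
  raid_pid=$!
  waitforlisten $raid_pid /var/tmp/spdk-raid.sock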
00:11:50.123 [2024-05-15 04:12:37.917045] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:50.123 [2024-05-15 04:12:37.998219] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:50.123 [2024-05-15 04:12:38.118950] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:11:50.379 [2024-05-15 04:12:38.201056] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:50.379 [2024-05-15 04:12:38.201091] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:50.379 04:12:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:11:50.379 04:12:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:11:50.379 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:50.636 [2024-05-15 04:12:38.491670] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:50.636 [2024-05-15 04:12:38.491707] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:50.636 [2024-05-15 04:12:38.491717] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:50.636 [2024-05-15 04:12:38.491728] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:50.636 [2024-05-15 04:12:38.491735] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:50.636 [2024-05-15 04:12:38.491746] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.636 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:50.893 04:12:38 
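[Editorial aside] Note the ordering here: bdev_raid_create is issued while none of the three named base bdevs exist yet. The RPC still succeeds, the missing members are only recorded ("doesn't exist now"), and the raid bdev waits in the "configuring" state until they appear. A sketch of that create call and the state query the test uses to verify it, copied from this trace (rpc_py is shorthand for the rpc.py invocation):

  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc_py bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  # expected while no member exists: state "configuring", 0 of 3 base bdevs discovered
  $rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'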
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:50.893 "name": "Existed_Raid", 00:11:50.893 "uuid": "11bfc54a-c7d5-4055-8f37-6efb6c078e92", 00:11:50.893 "strip_size_kb": 64, 00:11:50.893 "state": "configuring", 00:11:50.893 "raid_level": "concat", 00:11:50.893 "superblock": true, 00:11:50.893 "num_base_bdevs": 3, 00:11:50.893 "num_base_bdevs_discovered": 0, 00:11:50.893 "num_base_bdevs_operational": 3, 00:11:50.893 "base_bdevs_list": [ 00:11:50.893 { 00:11:50.893 "name": "BaseBdev1", 00:11:50.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:50.893 "is_configured": false, 00:11:50.893 "data_offset": 0, 00:11:50.893 "data_size": 0 00:11:50.893 }, 00:11:50.893 { 00:11:50.893 "name": "BaseBdev2", 00:11:50.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:50.893 "is_configured": false, 00:11:50.893 "data_offset": 0, 00:11:50.893 "data_size": 0 00:11:50.893 }, 00:11:50.893 { 00:11:50.893 "name": "BaseBdev3", 00:11:50.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:50.893 "is_configured": false, 00:11:50.893 "data_offset": 0, 00:11:50.893 "data_size": 0 00:11:50.893 } 00:11:50.893 ] 00:11:50.893 }' 00:11:50.893 04:12:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:50.893 04:12:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:51.456 04:12:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:51.713 [2024-05-15 04:12:39.510290] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:51.713 [2024-05-15 04:12:39.510321] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe0d020 name Existed_Raid, state configuring 00:11:51.713 04:12:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:51.971 [2024-05-15 04:12:39.746930] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:51.971 [2024-05-15 04:12:39.746965] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:51.971 [2024-05-15 04:12:39.746975] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:51.971 [2024-05-15 04:12:39.746986] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:51.971 [2024-05-15 04:12:39.746994] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:51.971 [2024-05-15 04:12:39.747003] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:51.971 04:12:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:52.228 [2024-05-15 04:12:40.039163] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:52.228 BaseBdev1 00:11:52.228 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:11:52.228 04:12:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:11:52.228 04:12:40 
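[Editorial aside] Each member used by this test is a 32 MiB malloc disk with a 512-byte block size, which is exactly where the num_blocks value of 65536 in the bdev dumps comes from (32 * 1024 * 1024 / 512 = 65536). The creation call as it appears in the trace, with rpc_py as shorthand for the rpc.py invocation:

  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # 32 MiB malloc bdev with 512-byte blocks -> 65536 blocks; Existed_Raid claims it immediately
  $rpc_py bdev_malloc_create 32 512 -b BaseBdev1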
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:52.228 04:12:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:11:52.228 04:12:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:52.228 04:12:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:52.228 04:12:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:52.485 04:12:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:52.745 [ 00:11:52.745 { 00:11:52.745 "name": "BaseBdev1", 00:11:52.745 "aliases": [ 00:11:52.745 "8a10d88f-afa7-440c-888f-e05bc51cc4f9" 00:11:52.745 ], 00:11:52.745 "product_name": "Malloc disk", 00:11:52.745 "block_size": 512, 00:11:52.745 "num_blocks": 65536, 00:11:52.745 "uuid": "8a10d88f-afa7-440c-888f-e05bc51cc4f9", 00:11:52.745 "assigned_rate_limits": { 00:11:52.745 "rw_ios_per_sec": 0, 00:11:52.745 "rw_mbytes_per_sec": 0, 00:11:52.745 "r_mbytes_per_sec": 0, 00:11:52.745 "w_mbytes_per_sec": 0 00:11:52.745 }, 00:11:52.745 "claimed": true, 00:11:52.745 "claim_type": "exclusive_write", 00:11:52.745 "zoned": false, 00:11:52.745 "supported_io_types": { 00:11:52.745 "read": true, 00:11:52.745 "write": true, 00:11:52.745 "unmap": true, 00:11:52.745 "write_zeroes": true, 00:11:52.745 "flush": true, 00:11:52.745 "reset": true, 00:11:52.745 "compare": false, 00:11:52.745 "compare_and_write": false, 00:11:52.745 "abort": true, 00:11:52.745 "nvme_admin": false, 00:11:52.745 "nvme_io": false 00:11:52.745 }, 00:11:52.745 "memory_domains": [ 00:11:52.745 { 00:11:52.745 "dma_device_id": "system", 00:11:52.745 "dma_device_type": 1 00:11:52.745 }, 00:11:52.745 { 00:11:52.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:52.745 "dma_device_type": 2 00:11:52.745 } 00:11:52.745 ], 00:11:52.745 "driver_specific": {} 00:11:52.745 } 00:11:52.745 ] 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:52.745 04:12:40 
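[Editorial aside] As seen in this trace, the waitforbdev helper reduces to two RPCs: flush any pending bdev examine callbacks, then ask for the bdev by name with a timeout so the call waits until it is registered (the helper filled in 2000 ms here). The equivalent sequence:

  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc_py bdev_wait_for_examine
  $rpc_py bdev_get_bdevs -b BaseBdev1 -t 2000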
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.745 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:53.037 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:53.037 "name": "Existed_Raid", 00:11:53.037 "uuid": "669254f8-a265-455a-816c-d76ef1c2d13f", 00:11:53.037 "strip_size_kb": 64, 00:11:53.037 "state": "configuring", 00:11:53.037 "raid_level": "concat", 00:11:53.037 "superblock": true, 00:11:53.037 "num_base_bdevs": 3, 00:11:53.037 "num_base_bdevs_discovered": 1, 00:11:53.037 "num_base_bdevs_operational": 3, 00:11:53.037 "base_bdevs_list": [ 00:11:53.037 { 00:11:53.037 "name": "BaseBdev1", 00:11:53.037 "uuid": "8a10d88f-afa7-440c-888f-e05bc51cc4f9", 00:11:53.037 "is_configured": true, 00:11:53.037 "data_offset": 2048, 00:11:53.037 "data_size": 63488 00:11:53.037 }, 00:11:53.037 { 00:11:53.037 "name": "BaseBdev2", 00:11:53.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:53.037 "is_configured": false, 00:11:53.037 "data_offset": 0, 00:11:53.037 "data_size": 0 00:11:53.037 }, 00:11:53.037 { 00:11:53.037 "name": "BaseBdev3", 00:11:53.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:53.037 "is_configured": false, 00:11:53.037 "data_offset": 0, 00:11:53.037 "data_size": 0 00:11:53.037 } 00:11:53.037 ] 00:11:53.037 }' 00:11:53.037 04:12:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:53.037 04:12:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:53.601 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:53.601 [2024-05-15 04:12:41.579247] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:53.601 [2024-05-15 04:12:41.579295] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe0c8f0 name Existed_Raid, state configuring 00:11:53.601 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:53.859 [2024-05-15 04:12:41.815923] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:53.859 [2024-05-15 04:12:41.817290] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:53.859 [2024-05-15 04:12:41.817320] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:53.859 [2024-05-15 04:12:41.817331] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:53.859 [2024-05-15 04:12:41.817341] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:53.859 04:12:41 
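[Editorial aside] Because the array was created with -s, space is reserved on each member for the on-disk superblock: once BaseBdev1 joins, its slot in the dump above reports data_offset 2048 and data_size 63488 out of the malloc disk's 65536 blocks, while the still-unconfigured slots show 0/0. One way to read those fields back (the trailing jq projection is an illustrative filter, not one used by the script):

  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc_py bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "Existed_Raid")' \
      | jq '.base_bdevs_list[] | {name, data_offset, data_size}'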
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.859 04:12:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:54.116 04:12:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:54.116 "name": "Existed_Raid", 00:11:54.116 "uuid": "1a6fac3d-550e-45e3-af4a-fcc19dff53b4", 00:11:54.116 "strip_size_kb": 64, 00:11:54.116 "state": "configuring", 00:11:54.116 "raid_level": "concat", 00:11:54.116 "superblock": true, 00:11:54.116 "num_base_bdevs": 3, 00:11:54.116 "num_base_bdevs_discovered": 1, 00:11:54.116 "num_base_bdevs_operational": 3, 00:11:54.116 "base_bdevs_list": [ 00:11:54.116 { 00:11:54.116 "name": "BaseBdev1", 00:11:54.116 "uuid": "8a10d88f-afa7-440c-888f-e05bc51cc4f9", 00:11:54.116 "is_configured": true, 00:11:54.116 "data_offset": 2048, 00:11:54.116 "data_size": 63488 00:11:54.116 }, 00:11:54.116 { 00:11:54.116 "name": "BaseBdev2", 00:11:54.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.116 "is_configured": false, 00:11:54.116 "data_offset": 0, 00:11:54.116 "data_size": 0 00:11:54.116 }, 00:11:54.116 { 00:11:54.116 "name": "BaseBdev3", 00:11:54.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.116 "is_configured": false, 00:11:54.116 "data_offset": 0, 00:11:54.116 "data_size": 0 00:11:54.116 } 00:11:54.116 ] 00:11:54.116 }' 00:11:54.116 04:12:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:54.116 04:12:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:54.680 04:12:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:54.937 [2024-05-15 04:12:42.836254] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:54.937 BaseBdev2 00:11:54.937 04:12:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:11:54.937 04:12:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:11:54.937 04:12:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:54.937 04:12:42 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:11:54.937 04:12:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:54.937 04:12:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:54.937 04:12:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:55.194 04:12:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:55.494 [ 00:11:55.494 { 00:11:55.494 "name": "BaseBdev2", 00:11:55.494 "aliases": [ 00:11:55.494 "498715a5-c962-4c12-958d-caec4775f3f6" 00:11:55.494 ], 00:11:55.494 "product_name": "Malloc disk", 00:11:55.494 "block_size": 512, 00:11:55.494 "num_blocks": 65536, 00:11:55.494 "uuid": "498715a5-c962-4c12-958d-caec4775f3f6", 00:11:55.494 "assigned_rate_limits": { 00:11:55.494 "rw_ios_per_sec": 0, 00:11:55.494 "rw_mbytes_per_sec": 0, 00:11:55.494 "r_mbytes_per_sec": 0, 00:11:55.494 "w_mbytes_per_sec": 0 00:11:55.494 }, 00:11:55.494 "claimed": true, 00:11:55.494 "claim_type": "exclusive_write", 00:11:55.494 "zoned": false, 00:11:55.494 "supported_io_types": { 00:11:55.494 "read": true, 00:11:55.494 "write": true, 00:11:55.494 "unmap": true, 00:11:55.494 "write_zeroes": true, 00:11:55.494 "flush": true, 00:11:55.494 "reset": true, 00:11:55.494 "compare": false, 00:11:55.494 "compare_and_write": false, 00:11:55.494 "abort": true, 00:11:55.494 "nvme_admin": false, 00:11:55.494 "nvme_io": false 00:11:55.494 }, 00:11:55.494 "memory_domains": [ 00:11:55.494 { 00:11:55.494 "dma_device_id": "system", 00:11:55.494 "dma_device_type": 1 00:11:55.494 }, 00:11:55.494 { 00:11:55.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.494 "dma_device_type": 2 00:11:55.494 } 00:11:55.494 ], 00:11:55.494 "driver_specific": {} 00:11:55.494 } 00:11:55.494 ] 00:11:55.494 04:12:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:11:55.494 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:55.494 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:55.494 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:11:55.494 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:55.494 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:11:55.494 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:55.494 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:55.494 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:55.494 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:55.495 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:55.495 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:55.495 04:12:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:55.495 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.495 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:55.751 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:55.751 "name": "Existed_Raid", 00:11:55.751 "uuid": "1a6fac3d-550e-45e3-af4a-fcc19dff53b4", 00:11:55.751 "strip_size_kb": 64, 00:11:55.751 "state": "configuring", 00:11:55.751 "raid_level": "concat", 00:11:55.751 "superblock": true, 00:11:55.751 "num_base_bdevs": 3, 00:11:55.751 "num_base_bdevs_discovered": 2, 00:11:55.751 "num_base_bdevs_operational": 3, 00:11:55.751 "base_bdevs_list": [ 00:11:55.752 { 00:11:55.752 "name": "BaseBdev1", 00:11:55.752 "uuid": "8a10d88f-afa7-440c-888f-e05bc51cc4f9", 00:11:55.752 "is_configured": true, 00:11:55.752 "data_offset": 2048, 00:11:55.752 "data_size": 63488 00:11:55.752 }, 00:11:55.752 { 00:11:55.752 "name": "BaseBdev2", 00:11:55.752 "uuid": "498715a5-c962-4c12-958d-caec4775f3f6", 00:11:55.752 "is_configured": true, 00:11:55.752 "data_offset": 2048, 00:11:55.752 "data_size": 63488 00:11:55.752 }, 00:11:55.752 { 00:11:55.752 "name": "BaseBdev3", 00:11:55.752 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.752 "is_configured": false, 00:11:55.752 "data_offset": 0, 00:11:55.752 "data_size": 0 00:11:55.752 } 00:11:55.752 ] 00:11:55.752 }' 00:11:55.752 04:12:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:55.752 04:12:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:56.317 04:12:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:11:56.575 [2024-05-15 04:12:44.554673] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:11:56.575 [2024-05-15 04:12:44.554932] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xe0d7e0 00:11:56.575 [2024-05-15 04:12:44.554952] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:11:56.575 [2024-05-15 04:12:44.555130] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe246d0 00:11:56.575 [2024-05-15 04:12:44.555283] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe0d7e0 00:11:56.575 [2024-05-15 04:12:44.555300] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe0d7e0 00:11:56.575 [2024-05-15 04:12:44.555407] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:56.575 BaseBdev3 00:11:56.575 04:12:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:11:56.575 04:12:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:11:56.575 04:12:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:11:56.575 04:12:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:11:56.575 04:12:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:11:56.575 04:12:44 
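[Editorial aside] The debug lines above mark the point where the array comes together: as soon as BaseBdev3 is created and claimed, the concat volume finishes configuring, registers its I/O device and reports blockcnt 190464, i.e. the sum of the three members' data areas (3 * 63488). A quick check that the state has flipped to online (the .state accessor is an illustrative filter, not one from the script):

  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'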
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:11:56.575 04:12:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:57.140 04:12:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:11:57.140 [ 00:11:57.140 { 00:11:57.140 "name": "BaseBdev3", 00:11:57.140 "aliases": [ 00:11:57.140 "e7a9a835-9b05-4717-9d6f-b5141685feb0" 00:11:57.140 ], 00:11:57.140 "product_name": "Malloc disk", 00:11:57.140 "block_size": 512, 00:11:57.140 "num_blocks": 65536, 00:11:57.140 "uuid": "e7a9a835-9b05-4717-9d6f-b5141685feb0", 00:11:57.140 "assigned_rate_limits": { 00:11:57.140 "rw_ios_per_sec": 0, 00:11:57.140 "rw_mbytes_per_sec": 0, 00:11:57.140 "r_mbytes_per_sec": 0, 00:11:57.140 "w_mbytes_per_sec": 0 00:11:57.140 }, 00:11:57.140 "claimed": true, 00:11:57.140 "claim_type": "exclusive_write", 00:11:57.140 "zoned": false, 00:11:57.140 "supported_io_types": { 00:11:57.140 "read": true, 00:11:57.140 "write": true, 00:11:57.140 "unmap": true, 00:11:57.140 "write_zeroes": true, 00:11:57.140 "flush": true, 00:11:57.140 "reset": true, 00:11:57.140 "compare": false, 00:11:57.140 "compare_and_write": false, 00:11:57.140 "abort": true, 00:11:57.140 "nvme_admin": false, 00:11:57.140 "nvme_io": false 00:11:57.140 }, 00:11:57.140 "memory_domains": [ 00:11:57.140 { 00:11:57.140 "dma_device_id": "system", 00:11:57.140 "dma_device_type": 1 00:11:57.140 }, 00:11:57.140 { 00:11:57.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.140 "dma_device_type": 2 00:11:57.140 } 00:11:57.140 ], 00:11:57.140 "driver_specific": {} 00:11:57.140 } 00:11:57.140 ] 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:11:57.398 "name": "Existed_Raid", 00:11:57.398 "uuid": "1a6fac3d-550e-45e3-af4a-fcc19dff53b4", 00:11:57.398 "strip_size_kb": 64, 00:11:57.398 "state": "online", 00:11:57.398 "raid_level": "concat", 00:11:57.398 "superblock": true, 00:11:57.398 "num_base_bdevs": 3, 00:11:57.398 "num_base_bdevs_discovered": 3, 00:11:57.398 "num_base_bdevs_operational": 3, 00:11:57.398 "base_bdevs_list": [ 00:11:57.398 { 00:11:57.398 "name": "BaseBdev1", 00:11:57.398 "uuid": "8a10d88f-afa7-440c-888f-e05bc51cc4f9", 00:11:57.398 "is_configured": true, 00:11:57.398 "data_offset": 2048, 00:11:57.398 "data_size": 63488 00:11:57.398 }, 00:11:57.398 { 00:11:57.398 "name": "BaseBdev2", 00:11:57.398 "uuid": "498715a5-c962-4c12-958d-caec4775f3f6", 00:11:57.398 "is_configured": true, 00:11:57.398 "data_offset": 2048, 00:11:57.398 "data_size": 63488 00:11:57.398 }, 00:11:57.398 { 00:11:57.398 "name": "BaseBdev3", 00:11:57.398 "uuid": "e7a9a835-9b05-4717-9d6f-b5141685feb0", 00:11:57.398 "is_configured": true, 00:11:57.398 "data_offset": 2048, 00:11:57.398 "data_size": 63488 00:11:57.398 } 00:11:57.398 ] 00:11:57.398 }' 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:11:57.398 04:12:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:57.964 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:11:57.964 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:11:57.964 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:11:57.964 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:11:57.964 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:11:57.964 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:11:57.964 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:57.964 04:12:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:11:58.222 [2024-05-15 04:12:46.171324] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:58.222 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:11:58.222 "name": "Existed_Raid", 00:11:58.222 "aliases": [ 00:11:58.222 "1a6fac3d-550e-45e3-af4a-fcc19dff53b4" 00:11:58.222 ], 00:11:58.222 "product_name": "Raid Volume", 00:11:58.222 "block_size": 512, 00:11:58.222 "num_blocks": 190464, 00:11:58.222 "uuid": "1a6fac3d-550e-45e3-af4a-fcc19dff53b4", 00:11:58.222 "assigned_rate_limits": { 00:11:58.222 "rw_ios_per_sec": 0, 00:11:58.222 "rw_mbytes_per_sec": 0, 00:11:58.222 "r_mbytes_per_sec": 0, 00:11:58.222 "w_mbytes_per_sec": 0 00:11:58.222 }, 00:11:58.222 "claimed": false, 00:11:58.222 "zoned": false, 00:11:58.222 "supported_io_types": { 00:11:58.222 "read": true, 00:11:58.222 "write": true, 00:11:58.222 "unmap": true, 00:11:58.222 "write_zeroes": true, 
00:11:58.222 "flush": true, 00:11:58.222 "reset": true, 00:11:58.222 "compare": false, 00:11:58.222 "compare_and_write": false, 00:11:58.222 "abort": false, 00:11:58.222 "nvme_admin": false, 00:11:58.222 "nvme_io": false 00:11:58.222 }, 00:11:58.222 "memory_domains": [ 00:11:58.222 { 00:11:58.222 "dma_device_id": "system", 00:11:58.222 "dma_device_type": 1 00:11:58.222 }, 00:11:58.222 { 00:11:58.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.222 "dma_device_type": 2 00:11:58.222 }, 00:11:58.222 { 00:11:58.222 "dma_device_id": "system", 00:11:58.222 "dma_device_type": 1 00:11:58.222 }, 00:11:58.222 { 00:11:58.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.222 "dma_device_type": 2 00:11:58.222 }, 00:11:58.222 { 00:11:58.222 "dma_device_id": "system", 00:11:58.222 "dma_device_type": 1 00:11:58.222 }, 00:11:58.222 { 00:11:58.222 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.222 "dma_device_type": 2 00:11:58.222 } 00:11:58.222 ], 00:11:58.222 "driver_specific": { 00:11:58.222 "raid": { 00:11:58.222 "uuid": "1a6fac3d-550e-45e3-af4a-fcc19dff53b4", 00:11:58.222 "strip_size_kb": 64, 00:11:58.222 "state": "online", 00:11:58.222 "raid_level": "concat", 00:11:58.222 "superblock": true, 00:11:58.222 "num_base_bdevs": 3, 00:11:58.222 "num_base_bdevs_discovered": 3, 00:11:58.222 "num_base_bdevs_operational": 3, 00:11:58.222 "base_bdevs_list": [ 00:11:58.222 { 00:11:58.222 "name": "BaseBdev1", 00:11:58.222 "uuid": "8a10d88f-afa7-440c-888f-e05bc51cc4f9", 00:11:58.222 "is_configured": true, 00:11:58.222 "data_offset": 2048, 00:11:58.222 "data_size": 63488 00:11:58.222 }, 00:11:58.222 { 00:11:58.222 "name": "BaseBdev2", 00:11:58.222 "uuid": "498715a5-c962-4c12-958d-caec4775f3f6", 00:11:58.222 "is_configured": true, 00:11:58.222 "data_offset": 2048, 00:11:58.222 "data_size": 63488 00:11:58.222 }, 00:11:58.222 { 00:11:58.222 "name": "BaseBdev3", 00:11:58.222 "uuid": "e7a9a835-9b05-4717-9d6f-b5141685feb0", 00:11:58.222 "is_configured": true, 00:11:58.222 "data_offset": 2048, 00:11:58.222 "data_size": 63488 00:11:58.222 } 00:11:58.222 ] 00:11:58.222 } 00:11:58.222 } 00:11:58.222 }' 00:11:58.222 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:58.222 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:11:58.222 BaseBdev2 00:11:58.222 BaseBdev3' 00:11:58.222 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:58.222 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:58.222 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:58.480 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:58.480 "name": "BaseBdev1", 00:11:58.480 "aliases": [ 00:11:58.480 "8a10d88f-afa7-440c-888f-e05bc51cc4f9" 00:11:58.480 ], 00:11:58.480 "product_name": "Malloc disk", 00:11:58.480 "block_size": 512, 00:11:58.480 "num_blocks": 65536, 00:11:58.480 "uuid": "8a10d88f-afa7-440c-888f-e05bc51cc4f9", 00:11:58.480 "assigned_rate_limits": { 00:11:58.480 "rw_ios_per_sec": 0, 00:11:58.480 "rw_mbytes_per_sec": 0, 00:11:58.480 "r_mbytes_per_sec": 0, 00:11:58.480 "w_mbytes_per_sec": 0 00:11:58.480 }, 00:11:58.480 "claimed": true, 00:11:58.480 "claim_type": 
"exclusive_write", 00:11:58.480 "zoned": false, 00:11:58.480 "supported_io_types": { 00:11:58.480 "read": true, 00:11:58.480 "write": true, 00:11:58.480 "unmap": true, 00:11:58.480 "write_zeroes": true, 00:11:58.480 "flush": true, 00:11:58.480 "reset": true, 00:11:58.480 "compare": false, 00:11:58.480 "compare_and_write": false, 00:11:58.480 "abort": true, 00:11:58.480 "nvme_admin": false, 00:11:58.480 "nvme_io": false 00:11:58.480 }, 00:11:58.480 "memory_domains": [ 00:11:58.480 { 00:11:58.480 "dma_device_id": "system", 00:11:58.480 "dma_device_type": 1 00:11:58.480 }, 00:11:58.480 { 00:11:58.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.480 "dma_device_type": 2 00:11:58.480 } 00:11:58.480 ], 00:11:58.480 "driver_specific": {} 00:11:58.480 }' 00:11:58.480 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:58.738 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:58.738 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:58.738 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:58.738 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:58.738 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:58.738 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:58.738 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:58.738 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:58.738 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:58.997 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:58.997 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:58.997 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:58.997 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:58.997 04:12:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:59.255 "name": "BaseBdev2", 00:11:59.255 "aliases": [ 00:11:59.255 "498715a5-c962-4c12-958d-caec4775f3f6" 00:11:59.255 ], 00:11:59.255 "product_name": "Malloc disk", 00:11:59.255 "block_size": 512, 00:11:59.255 "num_blocks": 65536, 00:11:59.255 "uuid": "498715a5-c962-4c12-958d-caec4775f3f6", 00:11:59.255 "assigned_rate_limits": { 00:11:59.255 "rw_ios_per_sec": 0, 00:11:59.255 "rw_mbytes_per_sec": 0, 00:11:59.255 "r_mbytes_per_sec": 0, 00:11:59.255 "w_mbytes_per_sec": 0 00:11:59.255 }, 00:11:59.255 "claimed": true, 00:11:59.255 "claim_type": "exclusive_write", 00:11:59.255 "zoned": false, 00:11:59.255 "supported_io_types": { 00:11:59.255 "read": true, 00:11:59.255 "write": true, 00:11:59.255 "unmap": true, 00:11:59.255 "write_zeroes": true, 00:11:59.255 "flush": true, 00:11:59.255 "reset": true, 00:11:59.255 "compare": false, 00:11:59.255 "compare_and_write": false, 00:11:59.255 "abort": true, 00:11:59.255 "nvme_admin": false, 00:11:59.255 "nvme_io": false 
00:11:59.255 }, 00:11:59.255 "memory_domains": [ 00:11:59.255 { 00:11:59.255 "dma_device_id": "system", 00:11:59.255 "dma_device_type": 1 00:11:59.255 }, 00:11:59.255 { 00:11:59.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.255 "dma_device_type": 2 00:11:59.255 } 00:11:59.255 ], 00:11:59.255 "driver_specific": {} 00:11:59.255 }' 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:59.255 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:59.513 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:11:59.513 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:11:59.513 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:11:59.513 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:11:59.513 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:11:59.771 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:11:59.771 "name": "BaseBdev3", 00:11:59.771 "aliases": [ 00:11:59.771 "e7a9a835-9b05-4717-9d6f-b5141685feb0" 00:11:59.771 ], 00:11:59.771 "product_name": "Malloc disk", 00:11:59.771 "block_size": 512, 00:11:59.771 "num_blocks": 65536, 00:11:59.771 "uuid": "e7a9a835-9b05-4717-9d6f-b5141685feb0", 00:11:59.771 "assigned_rate_limits": { 00:11:59.771 "rw_ios_per_sec": 0, 00:11:59.771 "rw_mbytes_per_sec": 0, 00:11:59.771 "r_mbytes_per_sec": 0, 00:11:59.771 "w_mbytes_per_sec": 0 00:11:59.771 }, 00:11:59.771 "claimed": true, 00:11:59.771 "claim_type": "exclusive_write", 00:11:59.771 "zoned": false, 00:11:59.771 "supported_io_types": { 00:11:59.771 "read": true, 00:11:59.771 "write": true, 00:11:59.771 "unmap": true, 00:11:59.771 "write_zeroes": true, 00:11:59.771 "flush": true, 00:11:59.771 "reset": true, 00:11:59.771 "compare": false, 00:11:59.771 "compare_and_write": false, 00:11:59.771 "abort": true, 00:11:59.771 "nvme_admin": false, 00:11:59.771 "nvme_io": false 00:11:59.771 }, 00:11:59.771 "memory_domains": [ 00:11:59.771 { 00:11:59.771 "dma_device_id": "system", 00:11:59.771 "dma_device_type": 1 00:11:59.771 }, 00:11:59.771 { 00:11:59.771 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.771 "dma_device_type": 2 00:11:59.771 } 00:11:59.771 ], 00:11:59.771 "driver_specific": {} 00:11:59.771 }' 00:11:59.771 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.block_size 00:11:59.771 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:11:59.771 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:11:59.771 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:59.771 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:11:59.771 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:59.771 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:00.029 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:00.029 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:00.029 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:00.029 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:00.029 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:00.029 04:12:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:00.287 [2024-05-15 04:12:48.140415] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:00.287 [2024-05-15 04:12:48.140442] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:00.287 [2024-05-15 04:12:48.140485] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:00.287 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:00.288 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.288 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:00.546 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:00.546 "name": "Existed_Raid", 00:12:00.546 "uuid": "1a6fac3d-550e-45e3-af4a-fcc19dff53b4", 00:12:00.546 "strip_size_kb": 64, 00:12:00.546 "state": "offline", 00:12:00.546 "raid_level": "concat", 00:12:00.546 "superblock": true, 00:12:00.546 "num_base_bdevs": 3, 00:12:00.546 "num_base_bdevs_discovered": 2, 00:12:00.546 "num_base_bdevs_operational": 2, 00:12:00.546 "base_bdevs_list": [ 00:12:00.546 { 00:12:00.546 "name": null, 00:12:00.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:00.546 "is_configured": false, 00:12:00.546 "data_offset": 2048, 00:12:00.546 "data_size": 63488 00:12:00.546 }, 00:12:00.546 { 00:12:00.546 "name": "BaseBdev2", 00:12:00.546 "uuid": "498715a5-c962-4c12-958d-caec4775f3f6", 00:12:00.546 "is_configured": true, 00:12:00.546 "data_offset": 2048, 00:12:00.546 "data_size": 63488 00:12:00.546 }, 00:12:00.546 { 00:12:00.546 "name": "BaseBdev3", 00:12:00.546 "uuid": "e7a9a835-9b05-4717-9d6f-b5141685feb0", 00:12:00.546 "is_configured": true, 00:12:00.546 "data_offset": 2048, 00:12:00.546 "data_size": 63488 00:12:00.546 } 00:12:00.546 ] 00:12:00.546 }' 00:12:00.546 04:12:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:00.546 04:12:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:01.111 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:12:01.111 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:01.111 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.111 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:01.369 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:01.369 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:01.369 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:01.630 [2024-05-15 04:12:49.477783] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:01.630 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:01.630 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:01.630 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.630 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:01.888 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:01.888 04:12:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:01.888 04:12:49 
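[Editorial aside] Since concat provides no redundancy (the has_redundancy check above falls through to return 1), removing BaseBdev1 was enough to take the whole array from online to offline: the state dump shows two members still discovered and operational, a null name in the removed slot, and state "offline"; the test then deletes the remaining members the same way. A sketch of the removal and the follow-up state check, using names from this run (the .state accessor is an illustrative filter):

  rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc_py bdev_malloc_delete BaseBdev1
  # expected for a concat array after losing any member: "offline"
  $rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid").state'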
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:02.146 [2024-05-15 04:12:50.015949] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:02.146 [2024-05-15 04:12:50.016005] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe0d7e0 name Existed_Raid, state offline 00:12:02.146 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:02.146 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:02.146 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:02.146 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:12:02.403 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:12:02.403 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:12:02.403 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:12:02.403 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:12:02.403 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:02.403 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:02.661 BaseBdev2 00:12:02.661 04:12:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:12:02.661 04:12:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:02.661 04:12:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:02.661 04:12:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:02.661 04:12:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:02.661 04:12:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:02.661 04:12:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:02.919 04:12:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:03.177 [ 00:12:03.177 { 00:12:03.177 "name": "BaseBdev2", 00:12:03.177 "aliases": [ 00:12:03.177 "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd" 00:12:03.177 ], 00:12:03.177 "product_name": "Malloc disk", 00:12:03.177 "block_size": 512, 00:12:03.177 "num_blocks": 65536, 00:12:03.177 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 00:12:03.177 "assigned_rate_limits": { 00:12:03.177 "rw_ios_per_sec": 0, 00:12:03.177 "rw_mbytes_per_sec": 0, 00:12:03.177 "r_mbytes_per_sec": 0, 00:12:03.177 "w_mbytes_per_sec": 0 00:12:03.177 }, 00:12:03.177 "claimed": false, 00:12:03.177 "zoned": false, 00:12:03.177 "supported_io_types": { 00:12:03.177 "read": true, 00:12:03.177 "write": true, 
00:12:03.177 "unmap": true, 00:12:03.177 "write_zeroes": true, 00:12:03.177 "flush": true, 00:12:03.177 "reset": true, 00:12:03.177 "compare": false, 00:12:03.177 "compare_and_write": false, 00:12:03.177 "abort": true, 00:12:03.177 "nvme_admin": false, 00:12:03.177 "nvme_io": false 00:12:03.177 }, 00:12:03.177 "memory_domains": [ 00:12:03.177 { 00:12:03.177 "dma_device_id": "system", 00:12:03.177 "dma_device_type": 1 00:12:03.177 }, 00:12:03.177 { 00:12:03.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.177 "dma_device_type": 2 00:12:03.177 } 00:12:03.177 ], 00:12:03.177 "driver_specific": {} 00:12:03.177 } 00:12:03.177 ] 00:12:03.177 04:12:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:03.177 04:12:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:03.177 04:12:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:03.177 04:12:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:03.435 BaseBdev3 00:12:03.435 04:12:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:12:03.435 04:12:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:03.435 04:12:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:03.435 04:12:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:03.435 04:12:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:03.435 04:12:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:03.435 04:12:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:03.692 04:12:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:03.950 [ 00:12:03.950 { 00:12:03.950 "name": "BaseBdev3", 00:12:03.950 "aliases": [ 00:12:03.950 "58706f99-d4ce-487c-a7b3-408304c96c64" 00:12:03.950 ], 00:12:03.950 "product_name": "Malloc disk", 00:12:03.950 "block_size": 512, 00:12:03.950 "num_blocks": 65536, 00:12:03.950 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:03.950 "assigned_rate_limits": { 00:12:03.950 "rw_ios_per_sec": 0, 00:12:03.950 "rw_mbytes_per_sec": 0, 00:12:03.950 "r_mbytes_per_sec": 0, 00:12:03.950 "w_mbytes_per_sec": 0 00:12:03.950 }, 00:12:03.950 "claimed": false, 00:12:03.951 "zoned": false, 00:12:03.951 "supported_io_types": { 00:12:03.951 "read": true, 00:12:03.951 "write": true, 00:12:03.951 "unmap": true, 00:12:03.951 "write_zeroes": true, 00:12:03.951 "flush": true, 00:12:03.951 "reset": true, 00:12:03.951 "compare": false, 00:12:03.951 "compare_and_write": false, 00:12:03.951 "abort": true, 00:12:03.951 "nvme_admin": false, 00:12:03.951 "nvme_io": false 00:12:03.951 }, 00:12:03.951 "memory_domains": [ 00:12:03.951 { 00:12:03.951 "dma_device_id": "system", 00:12:03.951 "dma_device_type": 1 00:12:03.951 }, 00:12:03.951 { 00:12:03.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.951 "dma_device_type": 2 00:12:03.951 } 
00:12:03.951 ], 00:12:03.951 "driver_specific": {} 00:12:03.951 } 00:12:03.951 ] 00:12:03.951 04:12:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:03.951 04:12:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:03.951 04:12:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:03.951 04:12:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:04.209 [2024-05-15 04:12:52.106494] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:04.209 [2024-05-15 04:12:52.106535] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:04.209 [2024-05-15 04:12:52.106559] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:04.209 [2024-05-15 04:12:52.107861] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.209 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:04.466 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:04.467 "name": "Existed_Raid", 00:12:04.467 "uuid": "b389de92-94a4-4112-bd36-ad3045ed4d6a", 00:12:04.467 "strip_size_kb": 64, 00:12:04.467 "state": "configuring", 00:12:04.467 "raid_level": "concat", 00:12:04.467 "superblock": true, 00:12:04.467 "num_base_bdevs": 3, 00:12:04.467 "num_base_bdevs_discovered": 2, 00:12:04.467 "num_base_bdevs_operational": 3, 00:12:04.467 "base_bdevs_list": [ 00:12:04.467 { 00:12:04.467 "name": "BaseBdev1", 00:12:04.467 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.467 "is_configured": false, 00:12:04.467 "data_offset": 0, 00:12:04.467 "data_size": 0 00:12:04.467 }, 00:12:04.467 { 00:12:04.467 "name": "BaseBdev2", 00:12:04.467 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 
00:12:04.467 "is_configured": true, 00:12:04.467 "data_offset": 2048, 00:12:04.467 "data_size": 63488 00:12:04.467 }, 00:12:04.467 { 00:12:04.467 "name": "BaseBdev3", 00:12:04.467 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:04.467 "is_configured": true, 00:12:04.467 "data_offset": 2048, 00:12:04.467 "data_size": 63488 00:12:04.467 } 00:12:04.467 ] 00:12:04.467 }' 00:12:04.467 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:04.467 04:12:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:05.032 04:12:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:05.288 [2024-05-15 04:12:53.149274] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.289 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:05.547 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:05.547 "name": "Existed_Raid", 00:12:05.547 "uuid": "b389de92-94a4-4112-bd36-ad3045ed4d6a", 00:12:05.547 "strip_size_kb": 64, 00:12:05.547 "state": "configuring", 00:12:05.547 "raid_level": "concat", 00:12:05.547 "superblock": true, 00:12:05.547 "num_base_bdevs": 3, 00:12:05.547 "num_base_bdevs_discovered": 1, 00:12:05.547 "num_base_bdevs_operational": 3, 00:12:05.547 "base_bdevs_list": [ 00:12:05.547 { 00:12:05.547 "name": "BaseBdev1", 00:12:05.547 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.547 "is_configured": false, 00:12:05.547 "data_offset": 0, 00:12:05.547 "data_size": 0 00:12:05.547 }, 00:12:05.547 { 00:12:05.547 "name": null, 00:12:05.547 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 00:12:05.547 "is_configured": false, 00:12:05.547 "data_offset": 2048, 00:12:05.547 "data_size": 63488 00:12:05.547 }, 00:12:05.547 { 00:12:05.547 "name": "BaseBdev3", 00:12:05.547 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:05.547 "is_configured": true, 00:12:05.547 
"data_offset": 2048, 00:12:05.547 "data_size": 63488 00:12:05.547 } 00:12:05.547 ] 00:12:05.547 }' 00:12:05.547 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:05.547 04:12:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:06.113 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.113 04:12:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:06.371 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:12:06.371 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:06.629 [2024-05-15 04:12:54.446981] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:06.629 BaseBdev1 00:12:06.629 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:12:06.629 04:12:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:12:06.629 04:12:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:06.629 04:12:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:06.629 04:12:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:06.629 04:12:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:06.629 04:12:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:06.887 04:12:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:07.145 [ 00:12:07.145 { 00:12:07.145 "name": "BaseBdev1", 00:12:07.145 "aliases": [ 00:12:07.145 "4300513a-2700-4cd7-80fc-d312f6002e79" 00:12:07.145 ], 00:12:07.145 "product_name": "Malloc disk", 00:12:07.145 "block_size": 512, 00:12:07.145 "num_blocks": 65536, 00:12:07.145 "uuid": "4300513a-2700-4cd7-80fc-d312f6002e79", 00:12:07.145 "assigned_rate_limits": { 00:12:07.145 "rw_ios_per_sec": 0, 00:12:07.145 "rw_mbytes_per_sec": 0, 00:12:07.145 "r_mbytes_per_sec": 0, 00:12:07.145 "w_mbytes_per_sec": 0 00:12:07.145 }, 00:12:07.145 "claimed": true, 00:12:07.145 "claim_type": "exclusive_write", 00:12:07.145 "zoned": false, 00:12:07.145 "supported_io_types": { 00:12:07.145 "read": true, 00:12:07.145 "write": true, 00:12:07.145 "unmap": true, 00:12:07.145 "write_zeroes": true, 00:12:07.145 "flush": true, 00:12:07.145 "reset": true, 00:12:07.145 "compare": false, 00:12:07.145 "compare_and_write": false, 00:12:07.145 "abort": true, 00:12:07.145 "nvme_admin": false, 00:12:07.145 "nvme_io": false 00:12:07.145 }, 00:12:07.145 "memory_domains": [ 00:12:07.145 { 00:12:07.145 "dma_device_id": "system", 00:12:07.145 "dma_device_type": 1 00:12:07.145 }, 00:12:07.145 { 00:12:07.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.145 "dma_device_type": 2 00:12:07.145 } 00:12:07.145 ], 
00:12:07.145 "driver_specific": {} 00:12:07.145 } 00:12:07.145 ] 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.145 04:12:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:07.403 04:12:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:07.403 "name": "Existed_Raid", 00:12:07.403 "uuid": "b389de92-94a4-4112-bd36-ad3045ed4d6a", 00:12:07.403 "strip_size_kb": 64, 00:12:07.403 "state": "configuring", 00:12:07.403 "raid_level": "concat", 00:12:07.403 "superblock": true, 00:12:07.403 "num_base_bdevs": 3, 00:12:07.403 "num_base_bdevs_discovered": 2, 00:12:07.403 "num_base_bdevs_operational": 3, 00:12:07.403 "base_bdevs_list": [ 00:12:07.403 { 00:12:07.403 "name": "BaseBdev1", 00:12:07.403 "uuid": "4300513a-2700-4cd7-80fc-d312f6002e79", 00:12:07.403 "is_configured": true, 00:12:07.403 "data_offset": 2048, 00:12:07.403 "data_size": 63488 00:12:07.403 }, 00:12:07.403 { 00:12:07.403 "name": null, 00:12:07.403 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 00:12:07.403 "is_configured": false, 00:12:07.403 "data_offset": 2048, 00:12:07.403 "data_size": 63488 00:12:07.403 }, 00:12:07.403 { 00:12:07.403 "name": "BaseBdev3", 00:12:07.403 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:07.403 "is_configured": true, 00:12:07.403 "data_offset": 2048, 00:12:07.403 "data_size": 63488 00:12:07.403 } 00:12:07.403 ] 00:12:07.403 }' 00:12:07.403 04:12:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:07.403 04:12:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:07.966 04:12:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.966 04:12:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:08.224 04:12:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ 
true == \t\r\u\e ]] 00:12:08.224 04:12:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:08.224 [2024-05-15 04:12:56.227685] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:08.482 "name": "Existed_Raid", 00:12:08.482 "uuid": "b389de92-94a4-4112-bd36-ad3045ed4d6a", 00:12:08.482 "strip_size_kb": 64, 00:12:08.482 "state": "configuring", 00:12:08.482 "raid_level": "concat", 00:12:08.482 "superblock": true, 00:12:08.482 "num_base_bdevs": 3, 00:12:08.482 "num_base_bdevs_discovered": 1, 00:12:08.482 "num_base_bdevs_operational": 3, 00:12:08.482 "base_bdevs_list": [ 00:12:08.482 { 00:12:08.482 "name": "BaseBdev1", 00:12:08.482 "uuid": "4300513a-2700-4cd7-80fc-d312f6002e79", 00:12:08.482 "is_configured": true, 00:12:08.482 "data_offset": 2048, 00:12:08.482 "data_size": 63488 00:12:08.482 }, 00:12:08.482 { 00:12:08.482 "name": null, 00:12:08.482 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 00:12:08.482 "is_configured": false, 00:12:08.482 "data_offset": 2048, 00:12:08.482 "data_size": 63488 00:12:08.482 }, 00:12:08.482 { 00:12:08.482 "name": null, 00:12:08.482 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:08.482 "is_configured": false, 00:12:08.482 "data_offset": 2048, 00:12:08.482 "data_size": 63488 00:12:08.482 } 00:12:08.482 ] 00:12:08.482 }' 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:08.482 04:12:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:09.047 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.047 04:12:57 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:09.305 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:12:09.305 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:09.871 [2024-05-15 04:12:57.579310] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:09.871 "name": "Existed_Raid", 00:12:09.871 "uuid": "b389de92-94a4-4112-bd36-ad3045ed4d6a", 00:12:09.871 "strip_size_kb": 64, 00:12:09.871 "state": "configuring", 00:12:09.871 "raid_level": "concat", 00:12:09.871 "superblock": true, 00:12:09.871 "num_base_bdevs": 3, 00:12:09.871 "num_base_bdevs_discovered": 2, 00:12:09.871 "num_base_bdevs_operational": 3, 00:12:09.871 "base_bdevs_list": [ 00:12:09.871 { 00:12:09.871 "name": "BaseBdev1", 00:12:09.871 "uuid": "4300513a-2700-4cd7-80fc-d312f6002e79", 00:12:09.871 "is_configured": true, 00:12:09.871 "data_offset": 2048, 00:12:09.871 "data_size": 63488 00:12:09.871 }, 00:12:09.871 { 00:12:09.871 "name": null, 00:12:09.871 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 00:12:09.871 "is_configured": false, 00:12:09.871 "data_offset": 2048, 00:12:09.871 "data_size": 63488 00:12:09.871 }, 00:12:09.871 { 00:12:09.871 "name": "BaseBdev3", 00:12:09.871 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:09.871 "is_configured": true, 00:12:09.871 "data_offset": 2048, 00:12:09.871 "data_size": 63488 00:12:09.871 } 00:12:09.871 ] 00:12:09.871 }' 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:09.871 04:12:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:10.437 04:12:58 bdev_raid.raid_state_function_test_sb 
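Here the slot emptied by bdev_raid_remove_base_bdev is filled again with bdev_raid_add_base_bdev, and the slot's is_configured flag is checked on either side of the call. A sketch of that round trip as the trace exercises it:

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Before the re-add the removed slot reports is_configured == false.
    $rpc_py bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'

    # Hand BaseBdev3 back to the existing array: the raid claims it again, the slot
    # flips to configured, and the array remains "configuring" while another slot
    # is still empty.
    $rpc_py bdev_raid_add_base_bdev Existed_Raid BaseBdev3
    $rpc_py bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'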
-- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.437 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:10.695 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:12:10.695 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:10.953 [2024-05-15 04:12:58.910878] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:10.953 04:12:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.211 04:12:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:11.211 "name": "Existed_Raid", 00:12:11.211 "uuid": "b389de92-94a4-4112-bd36-ad3045ed4d6a", 00:12:11.211 "strip_size_kb": 64, 00:12:11.211 "state": "configuring", 00:12:11.211 "raid_level": "concat", 00:12:11.211 "superblock": true, 00:12:11.211 "num_base_bdevs": 3, 00:12:11.211 "num_base_bdevs_discovered": 1, 00:12:11.211 "num_base_bdevs_operational": 3, 00:12:11.211 "base_bdevs_list": [ 00:12:11.211 { 00:12:11.211 "name": null, 00:12:11.211 "uuid": "4300513a-2700-4cd7-80fc-d312f6002e79", 00:12:11.211 "is_configured": false, 00:12:11.211 "data_offset": 2048, 00:12:11.211 "data_size": 63488 00:12:11.211 }, 00:12:11.211 { 00:12:11.211 "name": null, 00:12:11.211 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 00:12:11.211 "is_configured": false, 00:12:11.211 "data_offset": 2048, 00:12:11.211 "data_size": 63488 00:12:11.211 }, 00:12:11.211 { 00:12:11.211 "name": "BaseBdev3", 00:12:11.211 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:11.211 "is_configured": true, 00:12:11.211 "data_offset": 2048, 00:12:11.211 "data_size": 63488 00:12:11.211 } 00:12:11.211 ] 00:12:11.211 }' 00:12:11.211 04:12:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:12:11.211 04:12:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:11.776 04:12:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.776 04:12:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:12.034 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:12:12.034 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:12.292 [2024-05-15 04:13:00.279821] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.292 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:12.550 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:12.550 "name": "Existed_Raid", 00:12:12.550 "uuid": "b389de92-94a4-4112-bd36-ad3045ed4d6a", 00:12:12.550 "strip_size_kb": 64, 00:12:12.550 "state": "configuring", 00:12:12.550 "raid_level": "concat", 00:12:12.550 "superblock": true, 00:12:12.550 "num_base_bdevs": 3, 00:12:12.550 "num_base_bdevs_discovered": 2, 00:12:12.550 "num_base_bdevs_operational": 3, 00:12:12.550 "base_bdevs_list": [ 00:12:12.550 { 00:12:12.550 "name": null, 00:12:12.550 "uuid": "4300513a-2700-4cd7-80fc-d312f6002e79", 00:12:12.550 "is_configured": false, 00:12:12.550 "data_offset": 2048, 00:12:12.550 "data_size": 63488 00:12:12.550 }, 00:12:12.550 { 00:12:12.550 "name": "BaseBdev2", 00:12:12.550 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 00:12:12.550 "is_configured": true, 00:12:12.550 "data_offset": 2048, 00:12:12.550 "data_size": 63488 00:12:12.550 }, 00:12:12.550 { 00:12:12.550 "name": "BaseBdev3", 00:12:12.550 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:12.550 "is_configured": true, 00:12:12.550 
"data_offset": 2048, 00:12:12.550 "data_size": 63488 00:12:12.550 } 00:12:12.550 ] 00:12:12.550 }' 00:12:12.550 04:13:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:12.550 04:13:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:13.116 04:13:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.116 04:13:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:13.373 04:13:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:12:13.373 04:13:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.373 04:13:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:13.631 04:13:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4300513a-2700-4cd7-80fc-d312f6002e79 00:12:13.888 [2024-05-15 04:13:01.846267] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:13.888 [2024-05-15 04:13:01.846500] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xfb14a0 00:12:13.888 [2024-05-15 04:13:01.846519] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:13.888 [2024-05-15 04:13:01.846705] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe0c2e0 00:12:13.888 [2024-05-15 04:13:01.846860] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfb14a0 00:12:13.888 [2024-05-15 04:13:01.846877] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xfb14a0 00:12:13.888 [2024-05-15 04:13:01.847010] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:13.888 NewBaseBdev 00:12:13.888 04:13:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:12:13.888 04:13:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:12:13.888 04:13:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:13.888 04:13:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:12:13.888 04:13:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:13.888 04:13:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:13.888 04:13:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:14.147 04:13:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:14.405 [ 00:12:14.405 { 00:12:14.405 "name": "NewBaseBdev", 00:12:14.405 "aliases": [ 00:12:14.405 "4300513a-2700-4cd7-80fc-d312f6002e79" 00:12:14.405 ], 
00:12:14.405 "product_name": "Malloc disk", 00:12:14.405 "block_size": 512, 00:12:14.405 "num_blocks": 65536, 00:12:14.405 "uuid": "4300513a-2700-4cd7-80fc-d312f6002e79", 00:12:14.405 "assigned_rate_limits": { 00:12:14.405 "rw_ios_per_sec": 0, 00:12:14.405 "rw_mbytes_per_sec": 0, 00:12:14.405 "r_mbytes_per_sec": 0, 00:12:14.405 "w_mbytes_per_sec": 0 00:12:14.405 }, 00:12:14.405 "claimed": true, 00:12:14.405 "claim_type": "exclusive_write", 00:12:14.405 "zoned": false, 00:12:14.405 "supported_io_types": { 00:12:14.405 "read": true, 00:12:14.405 "write": true, 00:12:14.405 "unmap": true, 00:12:14.405 "write_zeroes": true, 00:12:14.405 "flush": true, 00:12:14.405 "reset": true, 00:12:14.405 "compare": false, 00:12:14.405 "compare_and_write": false, 00:12:14.405 "abort": true, 00:12:14.405 "nvme_admin": false, 00:12:14.405 "nvme_io": false 00:12:14.405 }, 00:12:14.405 "memory_domains": [ 00:12:14.405 { 00:12:14.405 "dma_device_id": "system", 00:12:14.405 "dma_device_type": 1 00:12:14.405 }, 00:12:14.405 { 00:12:14.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.405 "dma_device_type": 2 00:12:14.405 } 00:12:14.405 ], 00:12:14.405 "driver_specific": {} 00:12:14.405 } 00:12:14.405 ] 00:12:14.405 04:13:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:12:14.405 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:12:14.405 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:14.405 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:14.663 "name": "Existed_Raid", 00:12:14.663 "uuid": "b389de92-94a4-4112-bd36-ad3045ed4d6a", 00:12:14.663 "strip_size_kb": 64, 00:12:14.663 "state": "online", 00:12:14.663 "raid_level": "concat", 00:12:14.663 "superblock": true, 00:12:14.663 "num_base_bdevs": 3, 00:12:14.663 "num_base_bdevs_discovered": 3, 00:12:14.663 "num_base_bdevs_operational": 3, 00:12:14.663 "base_bdevs_list": [ 00:12:14.663 { 00:12:14.663 "name": "NewBaseBdev", 00:12:14.663 "uuid": "4300513a-2700-4cd7-80fc-d312f6002e79", 00:12:14.663 "is_configured": true, 00:12:14.663 "data_offset": 2048, 00:12:14.663 "data_size": 63488 
00:12:14.663 }, 00:12:14.663 { 00:12:14.663 "name": "BaseBdev2", 00:12:14.663 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 00:12:14.663 "is_configured": true, 00:12:14.663 "data_offset": 2048, 00:12:14.663 "data_size": 63488 00:12:14.663 }, 00:12:14.663 { 00:12:14.663 "name": "BaseBdev3", 00:12:14.663 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:14.663 "is_configured": true, 00:12:14.663 "data_offset": 2048, 00:12:14.663 "data_size": 63488 00:12:14.663 } 00:12:14.663 ] 00:12:14.663 }' 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:14.663 04:13:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:15.229 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:12:15.229 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:12:15.229 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:15.229 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:15.229 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:15.229 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:12:15.229 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:15.229 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:15.493 [2024-05-15 04:13:03.490898] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:15.753 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:15.753 "name": "Existed_Raid", 00:12:15.753 "aliases": [ 00:12:15.753 "b389de92-94a4-4112-bd36-ad3045ed4d6a" 00:12:15.753 ], 00:12:15.753 "product_name": "Raid Volume", 00:12:15.753 "block_size": 512, 00:12:15.753 "num_blocks": 190464, 00:12:15.753 "uuid": "b389de92-94a4-4112-bd36-ad3045ed4d6a", 00:12:15.753 "assigned_rate_limits": { 00:12:15.754 "rw_ios_per_sec": 0, 00:12:15.754 "rw_mbytes_per_sec": 0, 00:12:15.754 "r_mbytes_per_sec": 0, 00:12:15.754 "w_mbytes_per_sec": 0 00:12:15.754 }, 00:12:15.754 "claimed": false, 00:12:15.754 "zoned": false, 00:12:15.754 "supported_io_types": { 00:12:15.754 "read": true, 00:12:15.754 "write": true, 00:12:15.754 "unmap": true, 00:12:15.754 "write_zeroes": true, 00:12:15.754 "flush": true, 00:12:15.754 "reset": true, 00:12:15.754 "compare": false, 00:12:15.754 "compare_and_write": false, 00:12:15.754 "abort": false, 00:12:15.754 "nvme_admin": false, 00:12:15.754 "nvme_io": false 00:12:15.754 }, 00:12:15.754 "memory_domains": [ 00:12:15.754 { 00:12:15.754 "dma_device_id": "system", 00:12:15.754 "dma_device_type": 1 00:12:15.754 }, 00:12:15.754 { 00:12:15.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.754 "dma_device_type": 2 00:12:15.754 }, 00:12:15.754 { 00:12:15.754 "dma_device_id": "system", 00:12:15.754 "dma_device_type": 1 00:12:15.754 }, 00:12:15.754 { 00:12:15.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:15.754 "dma_device_type": 2 00:12:15.754 }, 00:12:15.754 { 00:12:15.754 "dma_device_id": "system", 00:12:15.754 "dma_device_type": 1 00:12:15.754 }, 00:12:15.754 { 00:12:15.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:12:15.754 "dma_device_type": 2 00:12:15.754 } 00:12:15.754 ], 00:12:15.754 "driver_specific": { 00:12:15.754 "raid": { 00:12:15.754 "uuid": "b389de92-94a4-4112-bd36-ad3045ed4d6a", 00:12:15.754 "strip_size_kb": 64, 00:12:15.754 "state": "online", 00:12:15.754 "raid_level": "concat", 00:12:15.754 "superblock": true, 00:12:15.754 "num_base_bdevs": 3, 00:12:15.754 "num_base_bdevs_discovered": 3, 00:12:15.754 "num_base_bdevs_operational": 3, 00:12:15.754 "base_bdevs_list": [ 00:12:15.754 { 00:12:15.754 "name": "NewBaseBdev", 00:12:15.754 "uuid": "4300513a-2700-4cd7-80fc-d312f6002e79", 00:12:15.754 "is_configured": true, 00:12:15.754 "data_offset": 2048, 00:12:15.754 "data_size": 63488 00:12:15.754 }, 00:12:15.754 { 00:12:15.754 "name": "BaseBdev2", 00:12:15.754 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 00:12:15.754 "is_configured": true, 00:12:15.754 "data_offset": 2048, 00:12:15.754 "data_size": 63488 00:12:15.754 }, 00:12:15.754 { 00:12:15.754 "name": "BaseBdev3", 00:12:15.754 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:15.754 "is_configured": true, 00:12:15.754 "data_offset": 2048, 00:12:15.754 "data_size": 63488 00:12:15.754 } 00:12:15.754 ] 00:12:15.754 } 00:12:15.754 } 00:12:15.754 }' 00:12:15.754 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:15.754 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:12:15.754 BaseBdev2 00:12:15.754 BaseBdev3' 00:12:15.754 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:15.754 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:15.754 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:16.012 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:16.012 "name": "NewBaseBdev", 00:12:16.012 "aliases": [ 00:12:16.012 "4300513a-2700-4cd7-80fc-d312f6002e79" 00:12:16.012 ], 00:12:16.012 "product_name": "Malloc disk", 00:12:16.012 "block_size": 512, 00:12:16.012 "num_blocks": 65536, 00:12:16.012 "uuid": "4300513a-2700-4cd7-80fc-d312f6002e79", 00:12:16.012 "assigned_rate_limits": { 00:12:16.012 "rw_ios_per_sec": 0, 00:12:16.012 "rw_mbytes_per_sec": 0, 00:12:16.012 "r_mbytes_per_sec": 0, 00:12:16.012 "w_mbytes_per_sec": 0 00:12:16.012 }, 00:12:16.012 "claimed": true, 00:12:16.012 "claim_type": "exclusive_write", 00:12:16.012 "zoned": false, 00:12:16.012 "supported_io_types": { 00:12:16.012 "read": true, 00:12:16.012 "write": true, 00:12:16.012 "unmap": true, 00:12:16.012 "write_zeroes": true, 00:12:16.012 "flush": true, 00:12:16.012 "reset": true, 00:12:16.012 "compare": false, 00:12:16.012 "compare_and_write": false, 00:12:16.012 "abort": true, 00:12:16.012 "nvme_admin": false, 00:12:16.012 "nvme_io": false 00:12:16.012 }, 00:12:16.012 "memory_domains": [ 00:12:16.012 { 00:12:16.012 "dma_device_id": "system", 00:12:16.012 "dma_device_type": 1 00:12:16.012 }, 00:12:16.012 { 00:12:16.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.012 "dma_device_type": 2 00:12:16.012 } 00:12:16.012 ], 00:12:16.012 "driver_specific": {} 00:12:16.012 }' 00:12:16.012 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:16.012 04:13:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:16.012 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:16.012 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:16.012 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:16.012 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:16.012 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:16.012 04:13:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:16.270 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:16.270 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:16.270 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:16.270 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:16.270 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:16.270 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:16.270 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:16.528 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:16.528 "name": "BaseBdev2", 00:12:16.528 "aliases": [ 00:12:16.528 "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd" 00:12:16.528 ], 00:12:16.528 "product_name": "Malloc disk", 00:12:16.528 "block_size": 512, 00:12:16.528 "num_blocks": 65536, 00:12:16.528 "uuid": "9f61a6aa-fb91-4dd3-a0ac-a750cccce1dd", 00:12:16.528 "assigned_rate_limits": { 00:12:16.528 "rw_ios_per_sec": 0, 00:12:16.528 "rw_mbytes_per_sec": 0, 00:12:16.528 "r_mbytes_per_sec": 0, 00:12:16.528 "w_mbytes_per_sec": 0 00:12:16.528 }, 00:12:16.528 "claimed": true, 00:12:16.528 "claim_type": "exclusive_write", 00:12:16.528 "zoned": false, 00:12:16.528 "supported_io_types": { 00:12:16.528 "read": true, 00:12:16.528 "write": true, 00:12:16.528 "unmap": true, 00:12:16.528 "write_zeroes": true, 00:12:16.528 "flush": true, 00:12:16.528 "reset": true, 00:12:16.528 "compare": false, 00:12:16.528 "compare_and_write": false, 00:12:16.528 "abort": true, 00:12:16.528 "nvme_admin": false, 00:12:16.528 "nvme_io": false 00:12:16.528 }, 00:12:16.528 "memory_domains": [ 00:12:16.528 { 00:12:16.528 "dma_device_id": "system", 00:12:16.528 "dma_device_type": 1 00:12:16.528 }, 00:12:16.528 { 00:12:16.528 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.528 "dma_device_type": 2 00:12:16.528 } 00:12:16.528 ], 00:12:16.528 "driver_specific": {} 00:12:16.528 }' 00:12:16.528 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:16.528 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:16.528 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:16.528 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:16.528 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:16.787 04:13:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:16.787 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:16.787 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:16.787 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:16.787 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:16.787 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:16.787 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:16.787 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:16.787 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:16.787 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:17.045 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:17.045 "name": "BaseBdev3", 00:12:17.045 "aliases": [ 00:12:17.045 "58706f99-d4ce-487c-a7b3-408304c96c64" 00:12:17.045 ], 00:12:17.045 "product_name": "Malloc disk", 00:12:17.045 "block_size": 512, 00:12:17.045 "num_blocks": 65536, 00:12:17.045 "uuid": "58706f99-d4ce-487c-a7b3-408304c96c64", 00:12:17.045 "assigned_rate_limits": { 00:12:17.045 "rw_ios_per_sec": 0, 00:12:17.045 "rw_mbytes_per_sec": 0, 00:12:17.045 "r_mbytes_per_sec": 0, 00:12:17.045 "w_mbytes_per_sec": 0 00:12:17.045 }, 00:12:17.045 "claimed": true, 00:12:17.045 "claim_type": "exclusive_write", 00:12:17.045 "zoned": false, 00:12:17.045 "supported_io_types": { 00:12:17.045 "read": true, 00:12:17.045 "write": true, 00:12:17.045 "unmap": true, 00:12:17.045 "write_zeroes": true, 00:12:17.045 "flush": true, 00:12:17.045 "reset": true, 00:12:17.045 "compare": false, 00:12:17.045 "compare_and_write": false, 00:12:17.045 "abort": true, 00:12:17.045 "nvme_admin": false, 00:12:17.045 "nvme_io": false 00:12:17.045 }, 00:12:17.045 "memory_domains": [ 00:12:17.045 { 00:12:17.045 "dma_device_id": "system", 00:12:17.045 "dma_device_type": 1 00:12:17.045 }, 00:12:17.045 { 00:12:17.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:17.045 "dma_device_type": 2 00:12:17.045 } 00:12:17.045 ], 00:12:17.045 "driver_specific": {} 00:12:17.045 }' 00:12:17.045 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:17.045 04:13:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:17.045 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:17.045 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:17.045 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:17.303 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:17.303 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:17.303 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:17.303 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:17.303 04:13:05 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:17.303 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:17.303 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:17.303 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:17.560 [2024-05-15 04:13:05.503941] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:17.560 [2024-05-15 04:13:05.503974] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:17.560 [2024-05-15 04:13:05.504059] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:17.560 [2024-05-15 04:13:05.504147] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:17.560 [2024-05-15 04:13:05.504164] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfb14a0 name Existed_Raid, state offline 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 3852469 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3852469 ']' 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 3852469 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3852469 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3852469' 00:12:17.560 killing process with pid 3852469 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 3852469 00:12:17.560 [2024-05-15 04:13:05.548108] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:17.560 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 3852469 00:12:17.818 [2024-05-15 04:13:05.584979] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:18.076 04:13:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:12:18.076 00:12:18.076 real 0m28.008s 00:12:18.076 user 0m52.505s 00:12:18.076 sys 0m3.925s 00:12:18.076 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:12:18.076 04:13:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:18.076 ************************************ 00:12:18.076 END TEST raid_state_function_test_sb 00:12:18.076 ************************************ 00:12:18.076 04:13:05 bdev_raid -- bdev/bdev_raid.sh@805 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:12:18.076 04:13:05 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:12:18.076 04:13:05 bdev_raid -- 
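The teardown that closes the test is two moves: delete the raid bdev, which walks it from online to offline and frees its base bdevs, then stop the bdev_svc app serving /var/tmp/spdk-raid.sock (pid 3852469 in this run, stopped through the killprocess helper). A sketch of the same teardown, with raid_pid standing in for the pid the script stored:

    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Remove the array; the trace shows the online -> offline transition and the
    # base bdevs being released during destruct.
    $rpc_py bdev_raid_delete Existed_Raid

    # Stop the SPDK app behind the RAID socket and reap it.
    kill "$raid_pid"
    wait "$raid_pid"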
common/autotest_common.sh@1103 -- # xtrace_disable 00:12:18.076 04:13:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:18.076 ************************************ 00:12:18.076 START TEST raid_superblock_test 00:12:18.076 ************************************ 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test concat 3 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=concat 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=3 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' concat '!=' raid1 ']' 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=3856394 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 3856394 /var/tmp/spdk-raid.sock 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 3856394 ']' 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:18.076 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:12:18.076 04:13:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.076 [2024-05-15 04:13:05.976350] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:12:18.076 [2024-05-15 04:13:05.976420] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3856394 ] 00:12:18.076 [2024-05-15 04:13:06.052648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:18.334 [2024-05-15 04:13:06.162956] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:12:18.334 [2024-05-15 04:13:06.232353] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:18.334 [2024-05-15 04:13:06.232392] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:19.265 04:13:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:12:19.265 04:13:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:12:19.265 04:13:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:12:19.265 04:13:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:12:19.265 04:13:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:12:19.265 04:13:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:12:19.265 04:13:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:19.265 04:13:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:19.265 04:13:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:12:19.265 04:13:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:19.266 04:13:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:19.266 malloc1 00:12:19.266 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:19.524 [2024-05-15 04:13:07.520580] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:19.524 [2024-05-15 04:13:07.520639] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:19.524 [2024-05-15 04:13:07.520670] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcabc20 00:12:19.524 [2024-05-15 04:13:07.520686] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:19.524 [2024-05-15 04:13:07.522468] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:19.524 [2024-05-15 04:13:07.522497] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:19.524 pt1 00:12:19.781 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:12:19.781 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:12:19.781 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:12:19.781 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:12:19.781 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:19.781 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:19.781 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:12:19.781 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:19.781 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:20.039 malloc2 00:12:20.039 04:13:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:20.297 [2024-05-15 04:13:08.102257] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:20.297 [2024-05-15 04:13:08.102315] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:20.297 [2024-05-15 04:13:08.102338] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xca3c00 00:12:20.297 [2024-05-15 04:13:08.102350] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:20.297 [2024-05-15 04:13:08.104082] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:20.297 [2024-05-15 04:13:08.104105] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:20.297 pt2 00:12:20.297 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:12:20.297 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:12:20.297 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:12:20.297 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:12:20.297 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:12:20.297 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:20.297 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:12:20.297 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:20.297 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:12:20.555 malloc3 00:12:20.555 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:20.813 [2024-05-15 04:13:08.686811] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:20.813 [2024-05-15 04:13:08.686883] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:20.813 [2024-05-15 04:13:08.686919] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe549c0 00:12:20.813 [2024-05-15 04:13:08.686941] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:20.813 [2024-05-15 04:13:08.688689] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:20.813 [2024-05-15 04:13:08.688719] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:20.813 pt3 00:12:20.813 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:12:20.813 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:12:20.813 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:12:21.096 [2024-05-15 04:13:08.967594] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:21.096 [2024-05-15 04:13:08.969056] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:21.096 [2024-05-15 04:13:08.969119] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:21.096 [2024-05-15 04:13:08.969318] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xca78e0 00:12:21.096 [2024-05-15 04:13:08.969336] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:21.096 [2024-05-15 04:13:08.969574] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xca6fa0 00:12:21.096 [2024-05-15 04:13:08.969752] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xca78e0 00:12:21.096 [2024-05-15 04:13:08.969769] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xca78e0 00:12:21.096 [2024-05-15 04:13:08.969918] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.096 04:13:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:21.367 04:13:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:21.367 "name": "raid_bdev1", 00:12:21.367 "uuid": "bfe1e776-dc2c-49e6-a055-f588634836e1", 00:12:21.367 "strip_size_kb": 64, 00:12:21.367 "state": "online", 00:12:21.367 "raid_level": "concat", 00:12:21.367 "superblock": true, 00:12:21.367 "num_base_bdevs": 3, 
00:12:21.367 "num_base_bdevs_discovered": 3, 00:12:21.367 "num_base_bdevs_operational": 3, 00:12:21.367 "base_bdevs_list": [ 00:12:21.367 { 00:12:21.367 "name": "pt1", 00:12:21.367 "uuid": "a25d7e91-cb03-50d0-9f12-1c8920165b45", 00:12:21.367 "is_configured": true, 00:12:21.367 "data_offset": 2048, 00:12:21.367 "data_size": 63488 00:12:21.367 }, 00:12:21.367 { 00:12:21.367 "name": "pt2", 00:12:21.367 "uuid": "adb819b8-f3b9-54f2-87c2-0cd898ffa637", 00:12:21.367 "is_configured": true, 00:12:21.367 "data_offset": 2048, 00:12:21.367 "data_size": 63488 00:12:21.367 }, 00:12:21.367 { 00:12:21.367 "name": "pt3", 00:12:21.367 "uuid": "6fda7b79-5683-5da9-85e5-c1493c089d66", 00:12:21.367 "is_configured": true, 00:12:21.367 "data_offset": 2048, 00:12:21.367 "data_size": 63488 00:12:21.367 } 00:12:21.367 ] 00:12:21.367 }' 00:12:21.367 04:13:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:21.367 04:13:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:21.932 04:13:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:12:21.932 04:13:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:12:21.932 04:13:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:21.932 04:13:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:21.932 04:13:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:21.932 04:13:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:12:21.932 04:13:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:21.932 04:13:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:22.190 [2024-05-15 04:13:10.026632] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:22.190 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:22.190 "name": "raid_bdev1", 00:12:22.190 "aliases": [ 00:12:22.190 "bfe1e776-dc2c-49e6-a055-f588634836e1" 00:12:22.190 ], 00:12:22.190 "product_name": "Raid Volume", 00:12:22.190 "block_size": 512, 00:12:22.190 "num_blocks": 190464, 00:12:22.190 "uuid": "bfe1e776-dc2c-49e6-a055-f588634836e1", 00:12:22.190 "assigned_rate_limits": { 00:12:22.190 "rw_ios_per_sec": 0, 00:12:22.190 "rw_mbytes_per_sec": 0, 00:12:22.190 "r_mbytes_per_sec": 0, 00:12:22.190 "w_mbytes_per_sec": 0 00:12:22.190 }, 00:12:22.190 "claimed": false, 00:12:22.190 "zoned": false, 00:12:22.190 "supported_io_types": { 00:12:22.190 "read": true, 00:12:22.190 "write": true, 00:12:22.190 "unmap": true, 00:12:22.190 "write_zeroes": true, 00:12:22.190 "flush": true, 00:12:22.190 "reset": true, 00:12:22.190 "compare": false, 00:12:22.190 "compare_and_write": false, 00:12:22.190 "abort": false, 00:12:22.190 "nvme_admin": false, 00:12:22.190 "nvme_io": false 00:12:22.190 }, 00:12:22.190 "memory_domains": [ 00:12:22.190 { 00:12:22.190 "dma_device_id": "system", 00:12:22.190 "dma_device_type": 1 00:12:22.190 }, 00:12:22.190 { 00:12:22.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.190 "dma_device_type": 2 00:12:22.190 }, 00:12:22.190 { 00:12:22.190 "dma_device_id": "system", 00:12:22.190 "dma_device_type": 1 00:12:22.190 }, 00:12:22.190 { 00:12:22.190 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:22.190 "dma_device_type": 2 00:12:22.190 }, 00:12:22.190 { 00:12:22.190 "dma_device_id": "system", 00:12:22.190 "dma_device_type": 1 00:12:22.190 }, 00:12:22.190 { 00:12:22.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.190 "dma_device_type": 2 00:12:22.190 } 00:12:22.190 ], 00:12:22.190 "driver_specific": { 00:12:22.190 "raid": { 00:12:22.190 "uuid": "bfe1e776-dc2c-49e6-a055-f588634836e1", 00:12:22.190 "strip_size_kb": 64, 00:12:22.190 "state": "online", 00:12:22.190 "raid_level": "concat", 00:12:22.190 "superblock": true, 00:12:22.190 "num_base_bdevs": 3, 00:12:22.190 "num_base_bdevs_discovered": 3, 00:12:22.190 "num_base_bdevs_operational": 3, 00:12:22.190 "base_bdevs_list": [ 00:12:22.190 { 00:12:22.190 "name": "pt1", 00:12:22.190 "uuid": "a25d7e91-cb03-50d0-9f12-1c8920165b45", 00:12:22.190 "is_configured": true, 00:12:22.190 "data_offset": 2048, 00:12:22.190 "data_size": 63488 00:12:22.190 }, 00:12:22.190 { 00:12:22.190 "name": "pt2", 00:12:22.190 "uuid": "adb819b8-f3b9-54f2-87c2-0cd898ffa637", 00:12:22.190 "is_configured": true, 00:12:22.190 "data_offset": 2048, 00:12:22.190 "data_size": 63488 00:12:22.190 }, 00:12:22.190 { 00:12:22.190 "name": "pt3", 00:12:22.190 "uuid": "6fda7b79-5683-5da9-85e5-c1493c089d66", 00:12:22.190 "is_configured": true, 00:12:22.190 "data_offset": 2048, 00:12:22.190 "data_size": 63488 00:12:22.190 } 00:12:22.190 ] 00:12:22.190 } 00:12:22.190 } 00:12:22.190 }' 00:12:22.190 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:22.190 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:12:22.190 pt2 00:12:22.190 pt3' 00:12:22.190 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:22.190 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:22.190 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:22.448 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:22.448 "name": "pt1", 00:12:22.448 "aliases": [ 00:12:22.448 "a25d7e91-cb03-50d0-9f12-1c8920165b45" 00:12:22.448 ], 00:12:22.448 "product_name": "passthru", 00:12:22.448 "block_size": 512, 00:12:22.448 "num_blocks": 65536, 00:12:22.448 "uuid": "a25d7e91-cb03-50d0-9f12-1c8920165b45", 00:12:22.448 "assigned_rate_limits": { 00:12:22.448 "rw_ios_per_sec": 0, 00:12:22.448 "rw_mbytes_per_sec": 0, 00:12:22.448 "r_mbytes_per_sec": 0, 00:12:22.448 "w_mbytes_per_sec": 0 00:12:22.448 }, 00:12:22.448 "claimed": true, 00:12:22.448 "claim_type": "exclusive_write", 00:12:22.448 "zoned": false, 00:12:22.448 "supported_io_types": { 00:12:22.448 "read": true, 00:12:22.448 "write": true, 00:12:22.448 "unmap": true, 00:12:22.448 "write_zeroes": true, 00:12:22.448 "flush": true, 00:12:22.448 "reset": true, 00:12:22.448 "compare": false, 00:12:22.448 "compare_and_write": false, 00:12:22.448 "abort": true, 00:12:22.448 "nvme_admin": false, 00:12:22.448 "nvme_io": false 00:12:22.448 }, 00:12:22.448 "memory_domains": [ 00:12:22.448 { 00:12:22.448 "dma_device_id": "system", 00:12:22.448 "dma_device_type": 1 00:12:22.448 }, 00:12:22.448 { 00:12:22.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.448 "dma_device_type": 2 00:12:22.448 } 00:12:22.448 ], 00:12:22.448 "driver_specific": { 
00:12:22.448 "passthru": { 00:12:22.448 "name": "pt1", 00:12:22.448 "base_bdev_name": "malloc1" 00:12:22.448 } 00:12:22.448 } 00:12:22.448 }' 00:12:22.448 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:22.448 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:22.448 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:22.448 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:22.448 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:22.705 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:22.705 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:22.705 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:22.705 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:22.705 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:22.705 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:22.705 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:22.705 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:22.705 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:22.705 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:22.962 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:22.962 "name": "pt2", 00:12:22.962 "aliases": [ 00:12:22.962 "adb819b8-f3b9-54f2-87c2-0cd898ffa637" 00:12:22.962 ], 00:12:22.962 "product_name": "passthru", 00:12:22.962 "block_size": 512, 00:12:22.962 "num_blocks": 65536, 00:12:22.962 "uuid": "adb819b8-f3b9-54f2-87c2-0cd898ffa637", 00:12:22.962 "assigned_rate_limits": { 00:12:22.962 "rw_ios_per_sec": 0, 00:12:22.962 "rw_mbytes_per_sec": 0, 00:12:22.962 "r_mbytes_per_sec": 0, 00:12:22.962 "w_mbytes_per_sec": 0 00:12:22.962 }, 00:12:22.962 "claimed": true, 00:12:22.962 "claim_type": "exclusive_write", 00:12:22.962 "zoned": false, 00:12:22.962 "supported_io_types": { 00:12:22.962 "read": true, 00:12:22.962 "write": true, 00:12:22.962 "unmap": true, 00:12:22.962 "write_zeroes": true, 00:12:22.962 "flush": true, 00:12:22.962 "reset": true, 00:12:22.962 "compare": false, 00:12:22.962 "compare_and_write": false, 00:12:22.962 "abort": true, 00:12:22.962 "nvme_admin": false, 00:12:22.962 "nvme_io": false 00:12:22.962 }, 00:12:22.962 "memory_domains": [ 00:12:22.962 { 00:12:22.962 "dma_device_id": "system", 00:12:22.962 "dma_device_type": 1 00:12:22.962 }, 00:12:22.962 { 00:12:22.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.962 "dma_device_type": 2 00:12:22.962 } 00:12:22.962 ], 00:12:22.962 "driver_specific": { 00:12:22.962 "passthru": { 00:12:22.962 "name": "pt2", 00:12:22.962 "base_bdev_name": "malloc2" 00:12:22.962 } 00:12:22.962 } 00:12:22.962 }' 00:12:22.962 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:22.962 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:22.962 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 
]] 00:12:22.962 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:23.219 04:13:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:23.219 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:23.219 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:23.219 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:23.219 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:23.219 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:23.219 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:23.219 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:23.219 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:23.219 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:23.219 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:23.477 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:23.477 "name": "pt3", 00:12:23.477 "aliases": [ 00:12:23.477 "6fda7b79-5683-5da9-85e5-c1493c089d66" 00:12:23.477 ], 00:12:23.477 "product_name": "passthru", 00:12:23.477 "block_size": 512, 00:12:23.477 "num_blocks": 65536, 00:12:23.477 "uuid": "6fda7b79-5683-5da9-85e5-c1493c089d66", 00:12:23.477 "assigned_rate_limits": { 00:12:23.477 "rw_ios_per_sec": 0, 00:12:23.477 "rw_mbytes_per_sec": 0, 00:12:23.477 "r_mbytes_per_sec": 0, 00:12:23.477 "w_mbytes_per_sec": 0 00:12:23.477 }, 00:12:23.477 "claimed": true, 00:12:23.477 "claim_type": "exclusive_write", 00:12:23.477 "zoned": false, 00:12:23.477 "supported_io_types": { 00:12:23.477 "read": true, 00:12:23.477 "write": true, 00:12:23.477 "unmap": true, 00:12:23.477 "write_zeroes": true, 00:12:23.477 "flush": true, 00:12:23.477 "reset": true, 00:12:23.477 "compare": false, 00:12:23.477 "compare_and_write": false, 00:12:23.477 "abort": true, 00:12:23.477 "nvme_admin": false, 00:12:23.477 "nvme_io": false 00:12:23.477 }, 00:12:23.477 "memory_domains": [ 00:12:23.477 { 00:12:23.477 "dma_device_id": "system", 00:12:23.477 "dma_device_type": 1 00:12:23.477 }, 00:12:23.477 { 00:12:23.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.477 "dma_device_type": 2 00:12:23.477 } 00:12:23.477 ], 00:12:23.477 "driver_specific": { 00:12:23.477 "passthru": { 00:12:23.477 "name": "pt3", 00:12:23.477 "base_bdev_name": "malloc3" 00:12:23.477 } 00:12:23.477 } 00:12:23.477 }' 00:12:23.477 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:23.477 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:23.477 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:23.477 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:23.734 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:23.734 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:23.734 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:23.734 04:13:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:23.734 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:23.734 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:23.734 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:23.734 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:23.734 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:23.734 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:12:23.992 [2024-05-15 04:13:11.963691] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:23.992 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=bfe1e776-dc2c-49e6-a055-f588634836e1 00:12:23.992 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z bfe1e776-dc2c-49e6-a055-f588634836e1 ']' 00:12:23.992 04:13:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:24.249 [2024-05-15 04:13:12.204089] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:24.249 [2024-05-15 04:13:12.204131] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:24.249 [2024-05-15 04:13:12.204212] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:24.249 [2024-05-15 04:13:12.204272] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:24.249 [2024-05-15 04:13:12.204286] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xca78e0 name raid_bdev1, state offline 00:12:24.249 04:13:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.249 04:13:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:12:24.506 04:13:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:12:24.506 04:13:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:12:24.506 04:13:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:12:24.506 04:13:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:24.764 04:13:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:12:24.764 04:13:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:25.021 04:13:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:12:25.021 04:13:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:25.586 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:12:26.156 [2024-05-15 04:13:13.864480] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:26.156 [2024-05-15 04:13:13.865981] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:26.156 [2024-05-15 04:13:13.866034] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:12:26.156 [2024-05-15 04:13:13.866099] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:26.156 [2024-05-15 04:13:13.866158] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:26.156 [2024-05-15 04:13:13.866192] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:12:26.156 [2024-05-15 04:13:13.866217] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:26.156 [2024-05-15 04:13:13.866231] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xca5530 name raid_bdev1, state configuring 00:12:26.156 
request: 00:12:26.156 { 00:12:26.156 "name": "raid_bdev1", 00:12:26.156 "raid_level": "concat", 00:12:26.156 "base_bdevs": [ 00:12:26.156 "malloc1", 00:12:26.156 "malloc2", 00:12:26.156 "malloc3" 00:12:26.156 ], 00:12:26.156 "superblock": false, 00:12:26.156 "strip_size_kb": 64, 00:12:26.156 "method": "bdev_raid_create", 00:12:26.156 "req_id": 1 00:12:26.156 } 00:12:26.156 Got JSON-RPC error response 00:12:26.156 response: 00:12:26.156 { 00:12:26.156 "code": -17, 00:12:26.156 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:26.156 } 00:12:26.156 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:26.156 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:26.156 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:26.156 04:13:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:26.156 04:13:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.156 04:13:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:12:26.156 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:12:26.156 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:12:26.156 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:26.721 [2024-05-15 04:13:14.437898] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:26.721 [2024-05-15 04:13:14.437960] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:26.721 [2024-05-15 04:13:14.437991] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe5edd0 00:12:26.721 [2024-05-15 04:13:14.438008] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:26.721 [2024-05-15 04:13:14.439802] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:26.721 [2024-05-15 04:13:14.439843] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:26.721 [2024-05-15 04:13:14.439941] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:26.721 [2024-05-15 04:13:14.439986] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:26.721 pt1 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:26.721 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.722 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:26.980 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:26.980 "name": "raid_bdev1", 00:12:26.980 "uuid": "bfe1e776-dc2c-49e6-a055-f588634836e1", 00:12:26.980 "strip_size_kb": 64, 00:12:26.980 "state": "configuring", 00:12:26.980 "raid_level": "concat", 00:12:26.980 "superblock": true, 00:12:26.980 "num_base_bdevs": 3, 00:12:26.980 "num_base_bdevs_discovered": 1, 00:12:26.980 "num_base_bdevs_operational": 3, 00:12:26.980 "base_bdevs_list": [ 00:12:26.980 { 00:12:26.980 "name": "pt1", 00:12:26.980 "uuid": "a25d7e91-cb03-50d0-9f12-1c8920165b45", 00:12:26.980 "is_configured": true, 00:12:26.980 "data_offset": 2048, 00:12:26.980 "data_size": 63488 00:12:26.980 }, 00:12:26.980 { 00:12:26.980 "name": null, 00:12:26.980 "uuid": "adb819b8-f3b9-54f2-87c2-0cd898ffa637", 00:12:26.980 "is_configured": false, 00:12:26.980 "data_offset": 2048, 00:12:26.980 "data_size": 63488 00:12:26.980 }, 00:12:26.980 { 00:12:26.980 "name": null, 00:12:26.980 "uuid": "6fda7b79-5683-5da9-85e5-c1493c089d66", 00:12:26.980 "is_configured": false, 00:12:26.980 "data_offset": 2048, 00:12:26.980 "data_size": 63488 00:12:26.980 } 00:12:26.980 ] 00:12:26.980 }' 00:12:26.980 04:13:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:26.980 04:13:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.547 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 3 -gt 2 ']' 00:12:27.547 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:27.804 [2024-05-15 04:13:15.597011] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:27.804 [2024-05-15 04:13:15.597075] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:27.804 [2024-05-15 04:13:15.597104] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xca3e30 00:12:27.804 [2024-05-15 04:13:15.597143] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:27.804 [2024-05-15 04:13:15.597597] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:27.804 [2024-05-15 04:13:15.597625] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:27.804 [2024-05-15 04:13:15.597718] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:27.804 [2024-05-15 04:13:15.597750] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:27.804 pt2 00:12:27.804 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:28.062 [2024-05-15 04:13:15.865735] 
bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.062 04:13:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:28.319 04:13:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:28.319 "name": "raid_bdev1", 00:12:28.319 "uuid": "bfe1e776-dc2c-49e6-a055-f588634836e1", 00:12:28.319 "strip_size_kb": 64, 00:12:28.319 "state": "configuring", 00:12:28.319 "raid_level": "concat", 00:12:28.319 "superblock": true, 00:12:28.319 "num_base_bdevs": 3, 00:12:28.320 "num_base_bdevs_discovered": 1, 00:12:28.320 "num_base_bdevs_operational": 3, 00:12:28.320 "base_bdevs_list": [ 00:12:28.320 { 00:12:28.320 "name": "pt1", 00:12:28.320 "uuid": "a25d7e91-cb03-50d0-9f12-1c8920165b45", 00:12:28.320 "is_configured": true, 00:12:28.320 "data_offset": 2048, 00:12:28.320 "data_size": 63488 00:12:28.320 }, 00:12:28.320 { 00:12:28.320 "name": null, 00:12:28.320 "uuid": "adb819b8-f3b9-54f2-87c2-0cd898ffa637", 00:12:28.320 "is_configured": false, 00:12:28.320 "data_offset": 2048, 00:12:28.320 "data_size": 63488 00:12:28.320 }, 00:12:28.320 { 00:12:28.320 "name": null, 00:12:28.320 "uuid": "6fda7b79-5683-5da9-85e5-c1493c089d66", 00:12:28.320 "is_configured": false, 00:12:28.320 "data_offset": 2048, 00:12:28.320 "data_size": 63488 00:12:28.320 } 00:12:28.320 ] 00:12:28.320 }' 00:12:28.320 04:13:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:28.320 04:13:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:28.884 04:13:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:12:28.884 04:13:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:12:28.884 04:13:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:29.142 [2024-05-15 04:13:16.936552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:29.142 [2024-05-15 04:13:16.936626] vbdev_passthru.c: 636:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:12:29.142 [2024-05-15 04:13:16.936658] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcac5e0 00:12:29.142 [2024-05-15 04:13:16.936675] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:29.142 [2024-05-15 04:13:16.937111] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:29.142 [2024-05-15 04:13:16.937137] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:29.142 [2024-05-15 04:13:16.937222] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:29.142 [2024-05-15 04:13:16.937252] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:29.142 pt2 00:12:29.142 04:13:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:12:29.142 04:13:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:12:29.142 04:13:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:29.399 [2024-05-15 04:13:17.181195] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:29.399 [2024-05-15 04:13:17.181249] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:29.399 [2024-05-15 04:13:17.181271] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcaadc0 00:12:29.399 [2024-05-15 04:13:17.181286] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:29.399 [2024-05-15 04:13:17.181597] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:29.399 [2024-05-15 04:13:17.181623] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:29.399 [2024-05-15 04:13:17.181690] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:12:29.399 [2024-05-15 04:13:17.181716] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:29.399 [2024-05-15 04:13:17.181866] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xcacf80 00:12:29.399 [2024-05-15 04:13:17.181884] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:29.399 [2024-05-15 04:13:17.182055] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xca40c0 00:12:29.399 [2024-05-15 04:13:17.182209] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcacf80 00:12:29.399 [2024-05-15 04:13:17.182226] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcacf80 00:12:29.399 [2024-05-15 04:13:17.182333] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:29.399 pt3 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:29.399 04:13:17 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.399 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:29.657 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:29.657 "name": "raid_bdev1", 00:12:29.657 "uuid": "bfe1e776-dc2c-49e6-a055-f588634836e1", 00:12:29.657 "strip_size_kb": 64, 00:12:29.657 "state": "online", 00:12:29.657 "raid_level": "concat", 00:12:29.657 "superblock": true, 00:12:29.657 "num_base_bdevs": 3, 00:12:29.657 "num_base_bdevs_discovered": 3, 00:12:29.657 "num_base_bdevs_operational": 3, 00:12:29.657 "base_bdevs_list": [ 00:12:29.657 { 00:12:29.657 "name": "pt1", 00:12:29.657 "uuid": "a25d7e91-cb03-50d0-9f12-1c8920165b45", 00:12:29.657 "is_configured": true, 00:12:29.657 "data_offset": 2048, 00:12:29.657 "data_size": 63488 00:12:29.657 }, 00:12:29.657 { 00:12:29.657 "name": "pt2", 00:12:29.657 "uuid": "adb819b8-f3b9-54f2-87c2-0cd898ffa637", 00:12:29.657 "is_configured": true, 00:12:29.657 "data_offset": 2048, 00:12:29.657 "data_size": 63488 00:12:29.657 }, 00:12:29.657 { 00:12:29.657 "name": "pt3", 00:12:29.657 "uuid": "6fda7b79-5683-5da9-85e5-c1493c089d66", 00:12:29.657 "is_configured": true, 00:12:29.657 "data_offset": 2048, 00:12:29.657 "data_size": 63488 00:12:29.657 } 00:12:29.657 ] 00:12:29.657 }' 00:12:29.657 04:13:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:29.657 04:13:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:30.223 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:12:30.223 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:12:30.223 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:30.223 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:30.223 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:30.223 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:12:30.223 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:30.223 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:30.223 [2024-05-15 04:13:18.236256] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:30.481 04:13:18 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:30.481 "name": "raid_bdev1", 00:12:30.481 "aliases": [ 00:12:30.481 "bfe1e776-dc2c-49e6-a055-f588634836e1" 00:12:30.481 ], 00:12:30.481 "product_name": "Raid Volume", 00:12:30.481 "block_size": 512, 00:12:30.481 "num_blocks": 190464, 00:12:30.481 "uuid": "bfe1e776-dc2c-49e6-a055-f588634836e1", 00:12:30.481 "assigned_rate_limits": { 00:12:30.481 "rw_ios_per_sec": 0, 00:12:30.481 "rw_mbytes_per_sec": 0, 00:12:30.481 "r_mbytes_per_sec": 0, 00:12:30.481 "w_mbytes_per_sec": 0 00:12:30.481 }, 00:12:30.481 "claimed": false, 00:12:30.481 "zoned": false, 00:12:30.481 "supported_io_types": { 00:12:30.481 "read": true, 00:12:30.481 "write": true, 00:12:30.481 "unmap": true, 00:12:30.481 "write_zeroes": true, 00:12:30.481 "flush": true, 00:12:30.481 "reset": true, 00:12:30.481 "compare": false, 00:12:30.481 "compare_and_write": false, 00:12:30.481 "abort": false, 00:12:30.481 "nvme_admin": false, 00:12:30.481 "nvme_io": false 00:12:30.481 }, 00:12:30.481 "memory_domains": [ 00:12:30.481 { 00:12:30.481 "dma_device_id": "system", 00:12:30.481 "dma_device_type": 1 00:12:30.481 }, 00:12:30.482 { 00:12:30.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.482 "dma_device_type": 2 00:12:30.482 }, 00:12:30.482 { 00:12:30.482 "dma_device_id": "system", 00:12:30.482 "dma_device_type": 1 00:12:30.482 }, 00:12:30.482 { 00:12:30.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.482 "dma_device_type": 2 00:12:30.482 }, 00:12:30.482 { 00:12:30.482 "dma_device_id": "system", 00:12:30.482 "dma_device_type": 1 00:12:30.482 }, 00:12:30.482 { 00:12:30.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.482 "dma_device_type": 2 00:12:30.482 } 00:12:30.482 ], 00:12:30.482 "driver_specific": { 00:12:30.482 "raid": { 00:12:30.482 "uuid": "bfe1e776-dc2c-49e6-a055-f588634836e1", 00:12:30.482 "strip_size_kb": 64, 00:12:30.482 "state": "online", 00:12:30.482 "raid_level": "concat", 00:12:30.482 "superblock": true, 00:12:30.482 "num_base_bdevs": 3, 00:12:30.482 "num_base_bdevs_discovered": 3, 00:12:30.482 "num_base_bdevs_operational": 3, 00:12:30.482 "base_bdevs_list": [ 00:12:30.482 { 00:12:30.482 "name": "pt1", 00:12:30.482 "uuid": "a25d7e91-cb03-50d0-9f12-1c8920165b45", 00:12:30.482 "is_configured": true, 00:12:30.482 "data_offset": 2048, 00:12:30.482 "data_size": 63488 00:12:30.482 }, 00:12:30.482 { 00:12:30.482 "name": "pt2", 00:12:30.482 "uuid": "adb819b8-f3b9-54f2-87c2-0cd898ffa637", 00:12:30.482 "is_configured": true, 00:12:30.482 "data_offset": 2048, 00:12:30.482 "data_size": 63488 00:12:30.482 }, 00:12:30.482 { 00:12:30.482 "name": "pt3", 00:12:30.482 "uuid": "6fda7b79-5683-5da9-85e5-c1493c089d66", 00:12:30.482 "is_configured": true, 00:12:30.482 "data_offset": 2048, 00:12:30.482 "data_size": 63488 00:12:30.482 } 00:12:30.482 ] 00:12:30.482 } 00:12:30.482 } 00:12:30.482 }' 00:12:30.482 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:30.482 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:12:30.482 pt2 00:12:30.482 pt3' 00:12:30.482 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:30.482 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:30.482 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # 
jq '.[]' 00:12:30.740 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:30.740 "name": "pt1", 00:12:30.740 "aliases": [ 00:12:30.740 "a25d7e91-cb03-50d0-9f12-1c8920165b45" 00:12:30.740 ], 00:12:30.740 "product_name": "passthru", 00:12:30.740 "block_size": 512, 00:12:30.740 "num_blocks": 65536, 00:12:30.740 "uuid": "a25d7e91-cb03-50d0-9f12-1c8920165b45", 00:12:30.740 "assigned_rate_limits": { 00:12:30.740 "rw_ios_per_sec": 0, 00:12:30.740 "rw_mbytes_per_sec": 0, 00:12:30.740 "r_mbytes_per_sec": 0, 00:12:30.740 "w_mbytes_per_sec": 0 00:12:30.740 }, 00:12:30.740 "claimed": true, 00:12:30.740 "claim_type": "exclusive_write", 00:12:30.740 "zoned": false, 00:12:30.740 "supported_io_types": { 00:12:30.740 "read": true, 00:12:30.740 "write": true, 00:12:30.740 "unmap": true, 00:12:30.740 "write_zeroes": true, 00:12:30.740 "flush": true, 00:12:30.740 "reset": true, 00:12:30.740 "compare": false, 00:12:30.740 "compare_and_write": false, 00:12:30.740 "abort": true, 00:12:30.740 "nvme_admin": false, 00:12:30.740 "nvme_io": false 00:12:30.740 }, 00:12:30.740 "memory_domains": [ 00:12:30.740 { 00:12:30.740 "dma_device_id": "system", 00:12:30.740 "dma_device_type": 1 00:12:30.740 }, 00:12:30.740 { 00:12:30.740 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.740 "dma_device_type": 2 00:12:30.740 } 00:12:30.740 ], 00:12:30.740 "driver_specific": { 00:12:30.740 "passthru": { 00:12:30.740 "name": "pt1", 00:12:30.740 "base_bdev_name": "malloc1" 00:12:30.740 } 00:12:30.740 } 00:12:30.740 }' 00:12:30.740 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:30.740 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:30.740 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:30.740 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:30.740 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:30.740 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:30.740 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:30.740 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:30.998 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:30.998 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:30.998 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:30.998 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:30.998 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:30.998 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:30.998 04:13:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:31.256 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:31.256 "name": "pt2", 00:12:31.256 "aliases": [ 00:12:31.256 "adb819b8-f3b9-54f2-87c2-0cd898ffa637" 00:12:31.256 ], 00:12:31.256 "product_name": "passthru", 00:12:31.256 "block_size": 512, 00:12:31.256 "num_blocks": 65536, 00:12:31.256 "uuid": "adb819b8-f3b9-54f2-87c2-0cd898ffa637", 00:12:31.256 "assigned_rate_limits": 
{ 00:12:31.256 "rw_ios_per_sec": 0, 00:12:31.256 "rw_mbytes_per_sec": 0, 00:12:31.256 "r_mbytes_per_sec": 0, 00:12:31.256 "w_mbytes_per_sec": 0 00:12:31.256 }, 00:12:31.256 "claimed": true, 00:12:31.256 "claim_type": "exclusive_write", 00:12:31.256 "zoned": false, 00:12:31.256 "supported_io_types": { 00:12:31.256 "read": true, 00:12:31.256 "write": true, 00:12:31.256 "unmap": true, 00:12:31.256 "write_zeroes": true, 00:12:31.256 "flush": true, 00:12:31.256 "reset": true, 00:12:31.256 "compare": false, 00:12:31.256 "compare_and_write": false, 00:12:31.256 "abort": true, 00:12:31.256 "nvme_admin": false, 00:12:31.256 "nvme_io": false 00:12:31.256 }, 00:12:31.256 "memory_domains": [ 00:12:31.256 { 00:12:31.256 "dma_device_id": "system", 00:12:31.256 "dma_device_type": 1 00:12:31.256 }, 00:12:31.256 { 00:12:31.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.256 "dma_device_type": 2 00:12:31.256 } 00:12:31.256 ], 00:12:31.256 "driver_specific": { 00:12:31.256 "passthru": { 00:12:31.256 "name": "pt2", 00:12:31.256 "base_bdev_name": "malloc2" 00:12:31.256 } 00:12:31.256 } 00:12:31.256 }' 00:12:31.256 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:31.256 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:31.256 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:31.256 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:31.256 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:31.256 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:31.256 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:31.256 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:31.513 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:31.513 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:31.513 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:31.513 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:31.513 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:31.513 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:31.513 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:31.771 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:31.771 "name": "pt3", 00:12:31.771 "aliases": [ 00:12:31.771 "6fda7b79-5683-5da9-85e5-c1493c089d66" 00:12:31.771 ], 00:12:31.771 "product_name": "passthru", 00:12:31.771 "block_size": 512, 00:12:31.771 "num_blocks": 65536, 00:12:31.771 "uuid": "6fda7b79-5683-5da9-85e5-c1493c089d66", 00:12:31.771 "assigned_rate_limits": { 00:12:31.771 "rw_ios_per_sec": 0, 00:12:31.771 "rw_mbytes_per_sec": 0, 00:12:31.771 "r_mbytes_per_sec": 0, 00:12:31.771 "w_mbytes_per_sec": 0 00:12:31.771 }, 00:12:31.771 "claimed": true, 00:12:31.771 "claim_type": "exclusive_write", 00:12:31.771 "zoned": false, 00:12:31.771 "supported_io_types": { 00:12:31.771 "read": true, 00:12:31.771 "write": true, 00:12:31.771 "unmap": true, 00:12:31.771 "write_zeroes": true, 00:12:31.771 
"flush": true, 00:12:31.771 "reset": true, 00:12:31.771 "compare": false, 00:12:31.771 "compare_and_write": false, 00:12:31.771 "abort": true, 00:12:31.771 "nvme_admin": false, 00:12:31.771 "nvme_io": false 00:12:31.771 }, 00:12:31.771 "memory_domains": [ 00:12:31.771 { 00:12:31.771 "dma_device_id": "system", 00:12:31.771 "dma_device_type": 1 00:12:31.771 }, 00:12:31.771 { 00:12:31.771 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.771 "dma_device_type": 2 00:12:31.771 } 00:12:31.771 ], 00:12:31.771 "driver_specific": { 00:12:31.771 "passthru": { 00:12:31.771 "name": "pt3", 00:12:31.771 "base_bdev_name": "malloc3" 00:12:31.771 } 00:12:31.771 } 00:12:31.771 }' 00:12:31.771 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:31.771 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:31.771 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:31.771 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:31.771 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:31.771 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:31.771 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:31.771 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:32.030 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:32.030 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:32.030 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:32.030 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:32.030 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:32.030 04:13:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:12:32.288 [2024-05-15 04:13:20.109258] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' bfe1e776-dc2c-49e6-a055-f588634836e1 '!=' bfe1e776-dc2c-49e6-a055-f588634836e1 ']' 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy concat 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # killprocess 3856394 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 3856394 ']' 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 3856394 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3856394 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:12:32.288 04:13:20 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3856394' 00:12:32.288 killing process with pid 3856394 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 3856394 00:12:32.288 [2024-05-15 04:13:20.154104] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:32.288 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 3856394 00:12:32.288 [2024-05-15 04:13:20.154213] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:32.288 [2024-05-15 04:13:20.154276] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:32.288 [2024-05-15 04:13:20.154290] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcacf80 name raid_bdev1, state offline 00:12:32.288 [2024-05-15 04:13:20.189158] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:32.546 04:13:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@565 -- # return 0 00:12:32.546 00:12:32.546 real 0m14.525s 00:12:32.546 user 0m26.607s 00:12:32.546 sys 0m2.021s 00:12:32.546 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:12:32.546 04:13:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:32.546 ************************************ 00:12:32.546 END TEST raid_superblock_test 00:12:32.546 ************************************ 00:12:32.546 04:13:20 bdev_raid -- bdev/bdev_raid.sh@802 -- # for level in raid0 concat raid1 00:12:32.546 04:13:20 bdev_raid -- bdev/bdev_raid.sh@803 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:12:32.546 04:13:20 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:12:32.546 04:13:20 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:12:32.546 04:13:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:32.546 ************************************ 00:12:32.546 START TEST raid_state_function_test 00:12:32.546 ************************************ 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 3 false 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 
00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:12:32.546 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=3858413 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3858413' 00:12:32.547 Process raid pid: 3858413 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 3858413 /var/tmp/spdk-raid.sock 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 3858413 ']' 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:32.547 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:12:32.547 04:13:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:32.547 [2024-05-15 04:13:20.559992] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
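Aside: the trace above starts a dedicated bdev_svc app on its own RPC socket (/var/tmp/spdk-raid.sock) and waits for it before issuing any raid RPCs. The following is a minimal stand-alone sketch of that same pattern, assuming SPDK is already built under $SPDK_DIR; the retry loop is a simplified stand-in for the harness's waitforlisten helper, not the helper itself, and only commands that appear in this trace are used.

  #!/usr/bin/env bash
  # Sketch only: reproduce the setup the trace shows, outside the test harness.
  set -e
  SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/crypto-phy-autotest/spdk}  # assumed build location
  SOCK=/var/tmp/spdk-raid.sock
  RPC="$SPDK_DIR/scripts/rpc.py -s $SOCK"

  # Same invocation as the trace: private RPC socket, instance id 0, bdev_raid debug logs.
  "$SPDK_DIR/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
  raid_pid=$!

  # Crude stand-in for waitforlisten: retry an RPC until the app answers.
  for _ in $(seq 1 50); do
      $RPC bdev_get_bdevs >/dev/null 2>&1 && break
      sleep 0.2
  done

  # A raid1 bdev can be declared before its base bdevs exist; it stays in the
  # "configuring" state until all three base bdevs are added, which is what the
  # verify_raid_bdev_state checks further down expect.
  $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'

  kill "$raid_pid"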
00:12:32.547 [2024-05-15 04:13:20.560070] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:32.805 [2024-05-15 04:13:20.640801] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:32.805 [2024-05-15 04:13:20.757529] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:12:33.063 [2024-05-15 04:13:20.824940] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:33.063 [2024-05-15 04:13:20.824979] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:33.063 04:13:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:12:33.063 04:13:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:12:33.063 04:13:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:33.321 [2024-05-15 04:13:21.100686] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:33.321 [2024-05-15 04:13:21.100733] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:33.321 [2024-05-15 04:13:21.100754] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:33.321 [2024-05-15 04:13:21.100767] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:33.321 [2024-05-15 04:13:21.100777] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:33.321 [2024-05-15 04:13:21.100789] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.321 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.579 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 
00:12:33.579 "name": "Existed_Raid", 00:12:33.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.579 "strip_size_kb": 0, 00:12:33.579 "state": "configuring", 00:12:33.579 "raid_level": "raid1", 00:12:33.579 "superblock": false, 00:12:33.579 "num_base_bdevs": 3, 00:12:33.579 "num_base_bdevs_discovered": 0, 00:12:33.579 "num_base_bdevs_operational": 3, 00:12:33.579 "base_bdevs_list": [ 00:12:33.579 { 00:12:33.579 "name": "BaseBdev1", 00:12:33.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.579 "is_configured": false, 00:12:33.579 "data_offset": 0, 00:12:33.579 "data_size": 0 00:12:33.579 }, 00:12:33.579 { 00:12:33.579 "name": "BaseBdev2", 00:12:33.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.579 "is_configured": false, 00:12:33.579 "data_offset": 0, 00:12:33.579 "data_size": 0 00:12:33.579 }, 00:12:33.579 { 00:12:33.579 "name": "BaseBdev3", 00:12:33.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.579 "is_configured": false, 00:12:33.579 "data_offset": 0, 00:12:33.579 "data_size": 0 00:12:33.579 } 00:12:33.579 ] 00:12:33.579 }' 00:12:33.579 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:33.579 04:13:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.144 04:13:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:34.401 [2024-05-15 04:13:22.171435] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:34.401 [2024-05-15 04:13:22.171468] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d5020 name Existed_Raid, state configuring 00:12:34.401 04:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:34.660 [2024-05-15 04:13:22.460214] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:34.660 [2024-05-15 04:13:22.460252] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:34.660 [2024-05-15 04:13:22.460264] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:34.660 [2024-05-15 04:13:22.460277] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:34.660 [2024-05-15 04:13:22.460286] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:34.660 [2024-05-15 04:13:22.460298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:34.660 04:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:34.917 [2024-05-15 04:13:22.725692] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:34.917 BaseBdev1 00:12:34.917 04:13:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:12:34.917 04:13:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:12:34.917 04:13:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:34.917 04:13:22 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:34.917 04:13:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:34.917 04:13:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:34.917 04:13:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:35.182 04:13:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:35.494 [ 00:12:35.494 { 00:12:35.494 "name": "BaseBdev1", 00:12:35.494 "aliases": [ 00:12:35.494 "8c9462bc-39ad-4465-b65e-677a20974721" 00:12:35.494 ], 00:12:35.494 "product_name": "Malloc disk", 00:12:35.494 "block_size": 512, 00:12:35.494 "num_blocks": 65536, 00:12:35.494 "uuid": "8c9462bc-39ad-4465-b65e-677a20974721", 00:12:35.494 "assigned_rate_limits": { 00:12:35.494 "rw_ios_per_sec": 0, 00:12:35.494 "rw_mbytes_per_sec": 0, 00:12:35.494 "r_mbytes_per_sec": 0, 00:12:35.494 "w_mbytes_per_sec": 0 00:12:35.494 }, 00:12:35.494 "claimed": true, 00:12:35.494 "claim_type": "exclusive_write", 00:12:35.494 "zoned": false, 00:12:35.494 "supported_io_types": { 00:12:35.494 "read": true, 00:12:35.494 "write": true, 00:12:35.494 "unmap": true, 00:12:35.494 "write_zeroes": true, 00:12:35.494 "flush": true, 00:12:35.494 "reset": true, 00:12:35.494 "compare": false, 00:12:35.494 "compare_and_write": false, 00:12:35.494 "abort": true, 00:12:35.494 "nvme_admin": false, 00:12:35.494 "nvme_io": false 00:12:35.494 }, 00:12:35.494 "memory_domains": [ 00:12:35.494 { 00:12:35.494 "dma_device_id": "system", 00:12:35.494 "dma_device_type": 1 00:12:35.494 }, 00:12:35.494 { 00:12:35.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:35.494 "dma_device_type": 2 00:12:35.494 } 00:12:35.494 ], 00:12:35.494 "driver_specific": {} 00:12:35.494 } 00:12:35.494 ] 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:35.494 "name": "Existed_Raid", 00:12:35.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.494 "strip_size_kb": 0, 00:12:35.494 "state": "configuring", 00:12:35.494 "raid_level": "raid1", 00:12:35.494 "superblock": false, 00:12:35.494 "num_base_bdevs": 3, 00:12:35.494 "num_base_bdevs_discovered": 1, 00:12:35.494 "num_base_bdevs_operational": 3, 00:12:35.494 "base_bdevs_list": [ 00:12:35.494 { 00:12:35.494 "name": "BaseBdev1", 00:12:35.494 "uuid": "8c9462bc-39ad-4465-b65e-677a20974721", 00:12:35.494 "is_configured": true, 00:12:35.494 "data_offset": 0, 00:12:35.494 "data_size": 65536 00:12:35.494 }, 00:12:35.494 { 00:12:35.494 "name": "BaseBdev2", 00:12:35.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.494 "is_configured": false, 00:12:35.494 "data_offset": 0, 00:12:35.494 "data_size": 0 00:12:35.494 }, 00:12:35.494 { 00:12:35.494 "name": "BaseBdev3", 00:12:35.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.494 "is_configured": false, 00:12:35.494 "data_offset": 0, 00:12:35.494 "data_size": 0 00:12:35.494 } 00:12:35.494 ] 00:12:35.494 }' 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:35.494 04:13:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.061 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:36.320 [2024-05-15 04:13:24.305934] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:36.320 [2024-05-15 04:13:24.305993] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d48f0 name Existed_Raid, state configuring 00:12:36.320 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:36.578 [2024-05-15 04:13:24.586726] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:36.578 [2024-05-15 04:13:24.588376] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:36.578 [2024-05-15 04:13:24.588423] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:36.578 [2024-05-15 04:13:24.588436] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:36.578 [2024-05-15 04:13:24.588449] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:36.837 04:13:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.837 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:37.095 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:37.095 "name": "Existed_Raid", 00:12:37.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.095 "strip_size_kb": 0, 00:12:37.095 "state": "configuring", 00:12:37.095 "raid_level": "raid1", 00:12:37.095 "superblock": false, 00:12:37.095 "num_base_bdevs": 3, 00:12:37.095 "num_base_bdevs_discovered": 1, 00:12:37.095 "num_base_bdevs_operational": 3, 00:12:37.095 "base_bdevs_list": [ 00:12:37.095 { 00:12:37.095 "name": "BaseBdev1", 00:12:37.095 "uuid": "8c9462bc-39ad-4465-b65e-677a20974721", 00:12:37.095 "is_configured": true, 00:12:37.095 "data_offset": 0, 00:12:37.095 "data_size": 65536 00:12:37.095 }, 00:12:37.095 { 00:12:37.095 "name": "BaseBdev2", 00:12:37.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.095 "is_configured": false, 00:12:37.095 "data_offset": 0, 00:12:37.095 "data_size": 0 00:12:37.095 }, 00:12:37.095 { 00:12:37.095 "name": "BaseBdev3", 00:12:37.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:37.095 "is_configured": false, 00:12:37.095 "data_offset": 0, 00:12:37.095 "data_size": 0 00:12:37.095 } 00:12:37.095 ] 00:12:37.095 }' 00:12:37.095 04:13:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:37.095 04:13:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:37.661 04:13:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:37.919 [2024-05-15 04:13:25.696151] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:37.919 BaseBdev2 00:12:37.919 04:13:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:12:37.919 04:13:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:37.919 04:13:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:37.919 04:13:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:37.919 04:13:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:37.919 04:13:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:37.919 04:13:25 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:38.177 04:13:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:38.435 [ 00:12:38.435 { 00:12:38.435 "name": "BaseBdev2", 00:12:38.435 "aliases": [ 00:12:38.435 "f457c505-4fc1-42eb-bdca-820fc7cbd34b" 00:12:38.435 ], 00:12:38.435 "product_name": "Malloc disk", 00:12:38.435 "block_size": 512, 00:12:38.435 "num_blocks": 65536, 00:12:38.435 "uuid": "f457c505-4fc1-42eb-bdca-820fc7cbd34b", 00:12:38.435 "assigned_rate_limits": { 00:12:38.435 "rw_ios_per_sec": 0, 00:12:38.435 "rw_mbytes_per_sec": 0, 00:12:38.435 "r_mbytes_per_sec": 0, 00:12:38.435 "w_mbytes_per_sec": 0 00:12:38.435 }, 00:12:38.435 "claimed": true, 00:12:38.435 "claim_type": "exclusive_write", 00:12:38.436 "zoned": false, 00:12:38.436 "supported_io_types": { 00:12:38.436 "read": true, 00:12:38.436 "write": true, 00:12:38.436 "unmap": true, 00:12:38.436 "write_zeroes": true, 00:12:38.436 "flush": true, 00:12:38.436 "reset": true, 00:12:38.436 "compare": false, 00:12:38.436 "compare_and_write": false, 00:12:38.436 "abort": true, 00:12:38.436 "nvme_admin": false, 00:12:38.436 "nvme_io": false 00:12:38.436 }, 00:12:38.436 "memory_domains": [ 00:12:38.436 { 00:12:38.436 "dma_device_id": "system", 00:12:38.436 "dma_device_type": 1 00:12:38.436 }, 00:12:38.436 { 00:12:38.436 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.436 "dma_device_type": 2 00:12:38.436 } 00:12:38.436 ], 00:12:38.436 "driver_specific": {} 00:12:38.436 } 00:12:38.436 ] 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.436 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:12:38.694 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:38.694 "name": "Existed_Raid", 00:12:38.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.694 "strip_size_kb": 0, 00:12:38.694 "state": "configuring", 00:12:38.694 "raid_level": "raid1", 00:12:38.694 "superblock": false, 00:12:38.694 "num_base_bdevs": 3, 00:12:38.694 "num_base_bdevs_discovered": 2, 00:12:38.694 "num_base_bdevs_operational": 3, 00:12:38.694 "base_bdevs_list": [ 00:12:38.694 { 00:12:38.694 "name": "BaseBdev1", 00:12:38.694 "uuid": "8c9462bc-39ad-4465-b65e-677a20974721", 00:12:38.694 "is_configured": true, 00:12:38.694 "data_offset": 0, 00:12:38.694 "data_size": 65536 00:12:38.694 }, 00:12:38.694 { 00:12:38.694 "name": "BaseBdev2", 00:12:38.694 "uuid": "f457c505-4fc1-42eb-bdca-820fc7cbd34b", 00:12:38.694 "is_configured": true, 00:12:38.694 "data_offset": 0, 00:12:38.694 "data_size": 65536 00:12:38.694 }, 00:12:38.694 { 00:12:38.694 "name": "BaseBdev3", 00:12:38.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.694 "is_configured": false, 00:12:38.694 "data_offset": 0, 00:12:38.694 "data_size": 0 00:12:38.694 } 00:12:38.694 ] 00:12:38.694 }' 00:12:38.694 04:13:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:38.694 04:13:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.259 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:39.517 [2024-05-15 04:13:27.346976] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:39.517 [2024-05-15 04:13:27.347050] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x12d57e0 00:12:39.517 [2024-05-15 04:13:27.347062] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:39.517 [2024-05-15 04:13:27.347277] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12ec6d0 00:12:39.517 [2024-05-15 04:13:27.347440] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12d57e0 00:12:39.517 [2024-05-15 04:13:27.347457] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12d57e0 00:12:39.517 [2024-05-15 04:13:27.347679] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:39.517 BaseBdev3 00:12:39.517 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:12:39.517 04:13:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:39.517 04:13:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:39.517 04:13:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:39.517 04:13:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:39.517 04:13:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:39.517 04:13:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:39.776 04:13:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:40.034 [ 00:12:40.034 { 00:12:40.034 "name": "BaseBdev3", 00:12:40.034 "aliases": [ 00:12:40.034 "0e633c27-b27d-4b19-bdf3-64a08f2393e5" 00:12:40.034 ], 00:12:40.034 "product_name": "Malloc disk", 00:12:40.034 "block_size": 512, 00:12:40.034 "num_blocks": 65536, 00:12:40.034 "uuid": "0e633c27-b27d-4b19-bdf3-64a08f2393e5", 00:12:40.034 "assigned_rate_limits": { 00:12:40.034 "rw_ios_per_sec": 0, 00:12:40.034 "rw_mbytes_per_sec": 0, 00:12:40.034 "r_mbytes_per_sec": 0, 00:12:40.034 "w_mbytes_per_sec": 0 00:12:40.034 }, 00:12:40.034 "claimed": true, 00:12:40.034 "claim_type": "exclusive_write", 00:12:40.034 "zoned": false, 00:12:40.034 "supported_io_types": { 00:12:40.034 "read": true, 00:12:40.034 "write": true, 00:12:40.034 "unmap": true, 00:12:40.034 "write_zeroes": true, 00:12:40.034 "flush": true, 00:12:40.034 "reset": true, 00:12:40.034 "compare": false, 00:12:40.034 "compare_and_write": false, 00:12:40.034 "abort": true, 00:12:40.034 "nvme_admin": false, 00:12:40.034 "nvme_io": false 00:12:40.034 }, 00:12:40.034 "memory_domains": [ 00:12:40.034 { 00:12:40.034 "dma_device_id": "system", 00:12:40.034 "dma_device_type": 1 00:12:40.034 }, 00:12:40.034 { 00:12:40.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.034 "dma_device_type": 2 00:12:40.034 } 00:12:40.034 ], 00:12:40.034 "driver_specific": {} 00:12:40.034 } 00:12:40.034 ] 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.034 04:13:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:40.293 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:40.293 "name": "Existed_Raid", 00:12:40.293 "uuid": "4c75c8c5-bc8a-4965-8f27-ef88ce379290", 00:12:40.293 "strip_size_kb": 0, 00:12:40.293 "state": "online", 
00:12:40.293 "raid_level": "raid1", 00:12:40.293 "superblock": false, 00:12:40.293 "num_base_bdevs": 3, 00:12:40.293 "num_base_bdevs_discovered": 3, 00:12:40.293 "num_base_bdevs_operational": 3, 00:12:40.293 "base_bdevs_list": [ 00:12:40.293 { 00:12:40.293 "name": "BaseBdev1", 00:12:40.293 "uuid": "8c9462bc-39ad-4465-b65e-677a20974721", 00:12:40.293 "is_configured": true, 00:12:40.293 "data_offset": 0, 00:12:40.293 "data_size": 65536 00:12:40.293 }, 00:12:40.293 { 00:12:40.293 "name": "BaseBdev2", 00:12:40.293 "uuid": "f457c505-4fc1-42eb-bdca-820fc7cbd34b", 00:12:40.293 "is_configured": true, 00:12:40.293 "data_offset": 0, 00:12:40.293 "data_size": 65536 00:12:40.293 }, 00:12:40.293 { 00:12:40.293 "name": "BaseBdev3", 00:12:40.293 "uuid": "0e633c27-b27d-4b19-bdf3-64a08f2393e5", 00:12:40.293 "is_configured": true, 00:12:40.293 "data_offset": 0, 00:12:40.293 "data_size": 65536 00:12:40.293 } 00:12:40.293 ] 00:12:40.293 }' 00:12:40.293 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:40.293 04:13:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.859 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:12:40.859 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:12:40.859 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:40.859 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:40.859 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:40.859 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:12:40.859 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:40.859 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:41.116 [2024-05-15 04:13:28.907372] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:41.116 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:41.116 "name": "Existed_Raid", 00:12:41.116 "aliases": [ 00:12:41.116 "4c75c8c5-bc8a-4965-8f27-ef88ce379290" 00:12:41.116 ], 00:12:41.116 "product_name": "Raid Volume", 00:12:41.116 "block_size": 512, 00:12:41.116 "num_blocks": 65536, 00:12:41.116 "uuid": "4c75c8c5-bc8a-4965-8f27-ef88ce379290", 00:12:41.116 "assigned_rate_limits": { 00:12:41.116 "rw_ios_per_sec": 0, 00:12:41.116 "rw_mbytes_per_sec": 0, 00:12:41.116 "r_mbytes_per_sec": 0, 00:12:41.116 "w_mbytes_per_sec": 0 00:12:41.116 }, 00:12:41.116 "claimed": false, 00:12:41.116 "zoned": false, 00:12:41.116 "supported_io_types": { 00:12:41.117 "read": true, 00:12:41.117 "write": true, 00:12:41.117 "unmap": false, 00:12:41.117 "write_zeroes": true, 00:12:41.117 "flush": false, 00:12:41.117 "reset": true, 00:12:41.117 "compare": false, 00:12:41.117 "compare_and_write": false, 00:12:41.117 "abort": false, 00:12:41.117 "nvme_admin": false, 00:12:41.117 "nvme_io": false 00:12:41.117 }, 00:12:41.117 "memory_domains": [ 00:12:41.117 { 00:12:41.117 "dma_device_id": "system", 00:12:41.117 "dma_device_type": 1 00:12:41.117 }, 00:12:41.117 { 00:12:41.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.117 "dma_device_type": 2 00:12:41.117 }, 
00:12:41.117 { 00:12:41.117 "dma_device_id": "system", 00:12:41.117 "dma_device_type": 1 00:12:41.117 }, 00:12:41.117 { 00:12:41.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.117 "dma_device_type": 2 00:12:41.117 }, 00:12:41.117 { 00:12:41.117 "dma_device_id": "system", 00:12:41.117 "dma_device_type": 1 00:12:41.117 }, 00:12:41.117 { 00:12:41.117 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.117 "dma_device_type": 2 00:12:41.117 } 00:12:41.117 ], 00:12:41.117 "driver_specific": { 00:12:41.117 "raid": { 00:12:41.117 "uuid": "4c75c8c5-bc8a-4965-8f27-ef88ce379290", 00:12:41.117 "strip_size_kb": 0, 00:12:41.117 "state": "online", 00:12:41.117 "raid_level": "raid1", 00:12:41.117 "superblock": false, 00:12:41.117 "num_base_bdevs": 3, 00:12:41.117 "num_base_bdevs_discovered": 3, 00:12:41.117 "num_base_bdevs_operational": 3, 00:12:41.117 "base_bdevs_list": [ 00:12:41.117 { 00:12:41.117 "name": "BaseBdev1", 00:12:41.117 "uuid": "8c9462bc-39ad-4465-b65e-677a20974721", 00:12:41.117 "is_configured": true, 00:12:41.117 "data_offset": 0, 00:12:41.117 "data_size": 65536 00:12:41.117 }, 00:12:41.117 { 00:12:41.117 "name": "BaseBdev2", 00:12:41.117 "uuid": "f457c505-4fc1-42eb-bdca-820fc7cbd34b", 00:12:41.117 "is_configured": true, 00:12:41.117 "data_offset": 0, 00:12:41.117 "data_size": 65536 00:12:41.117 }, 00:12:41.117 { 00:12:41.117 "name": "BaseBdev3", 00:12:41.117 "uuid": "0e633c27-b27d-4b19-bdf3-64a08f2393e5", 00:12:41.117 "is_configured": true, 00:12:41.117 "data_offset": 0, 00:12:41.117 "data_size": 65536 00:12:41.117 } 00:12:41.117 ] 00:12:41.117 } 00:12:41.117 } 00:12:41.117 }' 00:12:41.117 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:41.117 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:12:41.117 BaseBdev2 00:12:41.117 BaseBdev3' 00:12:41.117 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:41.117 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:41.117 04:13:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:41.374 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:41.374 "name": "BaseBdev1", 00:12:41.374 "aliases": [ 00:12:41.374 "8c9462bc-39ad-4465-b65e-677a20974721" 00:12:41.374 ], 00:12:41.374 "product_name": "Malloc disk", 00:12:41.374 "block_size": 512, 00:12:41.374 "num_blocks": 65536, 00:12:41.374 "uuid": "8c9462bc-39ad-4465-b65e-677a20974721", 00:12:41.374 "assigned_rate_limits": { 00:12:41.374 "rw_ios_per_sec": 0, 00:12:41.374 "rw_mbytes_per_sec": 0, 00:12:41.374 "r_mbytes_per_sec": 0, 00:12:41.374 "w_mbytes_per_sec": 0 00:12:41.374 }, 00:12:41.374 "claimed": true, 00:12:41.374 "claim_type": "exclusive_write", 00:12:41.374 "zoned": false, 00:12:41.374 "supported_io_types": { 00:12:41.374 "read": true, 00:12:41.374 "write": true, 00:12:41.374 "unmap": true, 00:12:41.374 "write_zeroes": true, 00:12:41.374 "flush": true, 00:12:41.374 "reset": true, 00:12:41.374 "compare": false, 00:12:41.374 "compare_and_write": false, 00:12:41.374 "abort": true, 00:12:41.374 "nvme_admin": false, 00:12:41.374 "nvme_io": false 00:12:41.374 }, 00:12:41.374 "memory_domains": [ 00:12:41.374 { 00:12:41.374 "dma_device_id": "system", 
00:12:41.374 "dma_device_type": 1 00:12:41.374 }, 00:12:41.374 { 00:12:41.374 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.374 "dma_device_type": 2 00:12:41.374 } 00:12:41.374 ], 00:12:41.374 "driver_specific": {} 00:12:41.374 }' 00:12:41.374 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:41.374 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:41.374 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:41.374 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:41.374 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:41.633 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:41.633 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:41.633 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:41.633 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:41.633 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:41.633 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:41.633 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:41.633 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:41.633 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:41.633 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:41.891 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:41.891 "name": "BaseBdev2", 00:12:41.891 "aliases": [ 00:12:41.891 "f457c505-4fc1-42eb-bdca-820fc7cbd34b" 00:12:41.891 ], 00:12:41.891 "product_name": "Malloc disk", 00:12:41.891 "block_size": 512, 00:12:41.891 "num_blocks": 65536, 00:12:41.891 "uuid": "f457c505-4fc1-42eb-bdca-820fc7cbd34b", 00:12:41.891 "assigned_rate_limits": { 00:12:41.891 "rw_ios_per_sec": 0, 00:12:41.891 "rw_mbytes_per_sec": 0, 00:12:41.891 "r_mbytes_per_sec": 0, 00:12:41.891 "w_mbytes_per_sec": 0 00:12:41.891 }, 00:12:41.891 "claimed": true, 00:12:41.891 "claim_type": "exclusive_write", 00:12:41.891 "zoned": false, 00:12:41.891 "supported_io_types": { 00:12:41.891 "read": true, 00:12:41.891 "write": true, 00:12:41.891 "unmap": true, 00:12:41.891 "write_zeroes": true, 00:12:41.891 "flush": true, 00:12:41.891 "reset": true, 00:12:41.891 "compare": false, 00:12:41.891 "compare_and_write": false, 00:12:41.891 "abort": true, 00:12:41.891 "nvme_admin": false, 00:12:41.891 "nvme_io": false 00:12:41.891 }, 00:12:41.891 "memory_domains": [ 00:12:41.891 { 00:12:41.891 "dma_device_id": "system", 00:12:41.891 "dma_device_type": 1 00:12:41.891 }, 00:12:41.891 { 00:12:41.891 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.891 "dma_device_type": 2 00:12:41.891 } 00:12:41.891 ], 00:12:41.891 "driver_specific": {} 00:12:41.891 }' 00:12:41.891 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:41.891 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:41.891 04:13:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:41.891 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:41.891 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:42.150 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:42.150 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:42.150 04:13:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:42.150 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:42.150 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:42.150 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:42.150 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:42.150 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:42.150 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:42.150 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:42.407 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:42.408 "name": "BaseBdev3", 00:12:42.408 "aliases": [ 00:12:42.408 "0e633c27-b27d-4b19-bdf3-64a08f2393e5" 00:12:42.408 ], 00:12:42.408 "product_name": "Malloc disk", 00:12:42.408 "block_size": 512, 00:12:42.408 "num_blocks": 65536, 00:12:42.408 "uuid": "0e633c27-b27d-4b19-bdf3-64a08f2393e5", 00:12:42.408 "assigned_rate_limits": { 00:12:42.408 "rw_ios_per_sec": 0, 00:12:42.408 "rw_mbytes_per_sec": 0, 00:12:42.408 "r_mbytes_per_sec": 0, 00:12:42.408 "w_mbytes_per_sec": 0 00:12:42.408 }, 00:12:42.408 "claimed": true, 00:12:42.408 "claim_type": "exclusive_write", 00:12:42.408 "zoned": false, 00:12:42.408 "supported_io_types": { 00:12:42.408 "read": true, 00:12:42.408 "write": true, 00:12:42.408 "unmap": true, 00:12:42.408 "write_zeroes": true, 00:12:42.408 "flush": true, 00:12:42.408 "reset": true, 00:12:42.408 "compare": false, 00:12:42.408 "compare_and_write": false, 00:12:42.408 "abort": true, 00:12:42.408 "nvme_admin": false, 00:12:42.408 "nvme_io": false 00:12:42.408 }, 00:12:42.408 "memory_domains": [ 00:12:42.408 { 00:12:42.408 "dma_device_id": "system", 00:12:42.408 "dma_device_type": 1 00:12:42.408 }, 00:12:42.408 { 00:12:42.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.408 "dma_device_type": 2 00:12:42.408 } 00:12:42.408 ], 00:12:42.408 "driver_specific": {} 00:12:42.408 }' 00:12:42.408 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:42.408 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:42.666 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:42.924 [2024-05-15 04:13:30.876397] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:42.924 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:12:42.924 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:12:42.924 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:12:42.924 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 0 00:12:42.924 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:12:42.924 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:42.924 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:42.924 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:42.925 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:42.925 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:42.925 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:12:42.925 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:42.925 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:42.925 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:42.925 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:42.925 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.925 04:13:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:43.182 04:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:43.182 "name": "Existed_Raid", 00:12:43.182 "uuid": "4c75c8c5-bc8a-4965-8f27-ef88ce379290", 00:12:43.182 "strip_size_kb": 0, 00:12:43.182 "state": "online", 00:12:43.182 "raid_level": "raid1", 00:12:43.182 "superblock": false, 00:12:43.182 "num_base_bdevs": 3, 00:12:43.182 "num_base_bdevs_discovered": 2, 00:12:43.182 "num_base_bdevs_operational": 2, 00:12:43.182 "base_bdevs_list": [ 00:12:43.182 { 00:12:43.182 "name": null, 00:12:43.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:43.182 "is_configured": false, 00:12:43.182 "data_offset": 0, 00:12:43.182 "data_size": 65536 
00:12:43.182 }, 00:12:43.182 { 00:12:43.182 "name": "BaseBdev2", 00:12:43.182 "uuid": "f457c505-4fc1-42eb-bdca-820fc7cbd34b", 00:12:43.182 "is_configured": true, 00:12:43.182 "data_offset": 0, 00:12:43.182 "data_size": 65536 00:12:43.182 }, 00:12:43.182 { 00:12:43.182 "name": "BaseBdev3", 00:12:43.182 "uuid": "0e633c27-b27d-4b19-bdf3-64a08f2393e5", 00:12:43.182 "is_configured": true, 00:12:43.182 "data_offset": 0, 00:12:43.182 "data_size": 65536 00:12:43.182 } 00:12:43.182 ] 00:12:43.182 }' 00:12:43.182 04:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:43.182 04:13:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:43.748 04:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:12:43.748 04:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:43.748 04:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.748 04:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:44.006 04:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:44.006 04:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:44.006 04:13:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:44.264 [2024-05-15 04:13:32.226462] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:44.264 04:13:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:44.264 04:13:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:44.264 04:13:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.264 04:13:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:12:44.522 04:13:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:12:44.522 04:13:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:44.522 04:13:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:44.780 [2024-05-15 04:13:32.760361] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:44.780 [2024-05-15 04:13:32.760479] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:44.780 [2024-05-15 04:13:32.774922] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:44.780 [2024-05-15 04:13:32.774987] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:44.780 [2024-05-15 04:13:32.775002] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12d57e0 name Existed_Raid, state offline 00:12:44.780 04:13:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:12:44.780 04:13:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:12:44.780 04:13:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.780 04:13:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:12:45.038 04:13:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:12:45.038 04:13:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:12:45.038 04:13:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:12:45.038 04:13:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:12:45.038 04:13:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:45.038 04:13:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:45.616 BaseBdev2 00:12:45.617 04:13:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:12:45.617 04:13:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:12:45.617 04:13:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:45.617 04:13:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:45.617 04:13:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:45.617 04:13:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:45.617 04:13:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:45.617 04:13:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:45.882 [ 00:12:45.882 { 00:12:45.882 "name": "BaseBdev2", 00:12:45.882 "aliases": [ 00:12:45.882 "e152fc40-6122-4849-a840-30fefbbfdefb" 00:12:45.882 ], 00:12:45.882 "product_name": "Malloc disk", 00:12:45.882 "block_size": 512, 00:12:45.882 "num_blocks": 65536, 00:12:45.882 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:45.882 "assigned_rate_limits": { 00:12:45.882 "rw_ios_per_sec": 0, 00:12:45.882 "rw_mbytes_per_sec": 0, 00:12:45.882 "r_mbytes_per_sec": 0, 00:12:45.882 "w_mbytes_per_sec": 0 00:12:45.882 }, 00:12:45.882 "claimed": false, 00:12:45.882 "zoned": false, 00:12:45.882 "supported_io_types": { 00:12:45.882 "read": true, 00:12:45.882 "write": true, 00:12:45.882 "unmap": true, 00:12:45.882 "write_zeroes": true, 00:12:45.882 "flush": true, 00:12:45.882 "reset": true, 00:12:45.882 "compare": false, 00:12:45.882 "compare_and_write": false, 00:12:45.882 "abort": true, 00:12:45.882 "nvme_admin": false, 00:12:45.882 "nvme_io": false 00:12:45.882 }, 00:12:45.882 "memory_domains": [ 00:12:45.882 { 00:12:45.882 "dma_device_id": "system", 00:12:45.882 "dma_device_type": 1 00:12:45.882 }, 00:12:45.882 { 00:12:45.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.882 "dma_device_type": 2 00:12:45.882 } 00:12:45.882 ], 00:12:45.882 "driver_specific": {} 00:12:45.882 } 00:12:45.882 ] 00:12:46.140 
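Annotation: the step traced just above recreates a base device with bdev_malloc_create and then waits for it with the waitforbdev helper (bdev_wait_for_examine followed by a bounded bdev_get_bdevs). A minimal standalone sketch of that create-and-wait pattern is shown below; it assumes $SPDK_DIR points at an SPDK checkout and that an SPDK app is already serving RPCs on /var/tmp/spdk-raid.sock, as in this job. It is an illustration of what the trace shows, not part of the test suite.

# Sketch only: $SPDK_DIR is an assumed path to an SPDK checkout; the app must
# already be listening on /var/tmp/spdk-raid.sock (as in the trace above).
rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Create a 32 MiB malloc bdev with 512-byte blocks (65536 blocks, matching the
# num_blocks value reported in the JSON dumps above).
$rpc bdev_malloc_create 32 512 -b BaseBdev2

# Let examine callbacks run, then poll for the bdev with a 2000 ms timeout,
# mirroring the waitforbdev helper seen in the xtrace output.
$rpc bdev_wait_for_examine
$rpc bdev_get_bdevs -b BaseBdev2 -t 2000 | jq '.[0].name'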
04:13:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:46.140 04:13:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:46.140 04:13:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:46.140 04:13:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:46.398 BaseBdev3 00:12:46.398 04:13:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:12:46.398 04:13:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:12:46.399 04:13:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:46.399 04:13:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:46.399 04:13:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:46.399 04:13:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:46.399 04:13:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:46.656 04:13:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:46.914 [ 00:12:46.914 { 00:12:46.914 "name": "BaseBdev3", 00:12:46.914 "aliases": [ 00:12:46.914 "6c4a941a-ff3b-4467-bab7-7de860a7bc03" 00:12:46.914 ], 00:12:46.914 "product_name": "Malloc disk", 00:12:46.914 "block_size": 512, 00:12:46.914 "num_blocks": 65536, 00:12:46.914 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:46.914 "assigned_rate_limits": { 00:12:46.914 "rw_ios_per_sec": 0, 00:12:46.914 "rw_mbytes_per_sec": 0, 00:12:46.914 "r_mbytes_per_sec": 0, 00:12:46.914 "w_mbytes_per_sec": 0 00:12:46.914 }, 00:12:46.914 "claimed": false, 00:12:46.914 "zoned": false, 00:12:46.914 "supported_io_types": { 00:12:46.914 "read": true, 00:12:46.914 "write": true, 00:12:46.914 "unmap": true, 00:12:46.914 "write_zeroes": true, 00:12:46.914 "flush": true, 00:12:46.914 "reset": true, 00:12:46.914 "compare": false, 00:12:46.914 "compare_and_write": false, 00:12:46.914 "abort": true, 00:12:46.914 "nvme_admin": false, 00:12:46.914 "nvme_io": false 00:12:46.914 }, 00:12:46.914 "memory_domains": [ 00:12:46.914 { 00:12:46.914 "dma_device_id": "system", 00:12:46.914 "dma_device_type": 1 00:12:46.914 }, 00:12:46.914 { 00:12:46.914 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.914 "dma_device_type": 2 00:12:46.914 } 00:12:46.914 ], 00:12:46.914 "driver_specific": {} 00:12:46.914 } 00:12:46.914 ] 00:12:46.914 04:13:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:46.914 04:13:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:12:46.914 04:13:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:12:46.914 04:13:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:47.172 [2024-05-15 
04:13:35.036638] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:47.172 [2024-05-15 04:13:35.036684] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:47.172 [2024-05-15 04:13:35.036709] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:47.172 [2024-05-15 04:13:35.038009] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.172 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:47.430 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:47.430 "name": "Existed_Raid", 00:12:47.430 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:47.430 "strip_size_kb": 0, 00:12:47.430 "state": "configuring", 00:12:47.430 "raid_level": "raid1", 00:12:47.430 "superblock": false, 00:12:47.430 "num_base_bdevs": 3, 00:12:47.430 "num_base_bdevs_discovered": 2, 00:12:47.430 "num_base_bdevs_operational": 3, 00:12:47.430 "base_bdevs_list": [ 00:12:47.430 { 00:12:47.430 "name": "BaseBdev1", 00:12:47.430 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:47.430 "is_configured": false, 00:12:47.430 "data_offset": 0, 00:12:47.430 "data_size": 0 00:12:47.430 }, 00:12:47.430 { 00:12:47.430 "name": "BaseBdev2", 00:12:47.430 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:47.430 "is_configured": true, 00:12:47.430 "data_offset": 0, 00:12:47.430 "data_size": 65536 00:12:47.430 }, 00:12:47.430 { 00:12:47.430 "name": "BaseBdev3", 00:12:47.430 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:47.430 "is_configured": true, 00:12:47.430 "data_offset": 0, 00:12:47.430 "data_size": 65536 00:12:47.430 } 00:12:47.430 ] 00:12:47.430 }' 00:12:47.430 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:47.430 04:13:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:47.996 04:13:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:48.254 [2024-05-15 04:13:36.071359] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.254 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:48.511 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:48.512 "name": "Existed_Raid", 00:12:48.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.512 "strip_size_kb": 0, 00:12:48.512 "state": "configuring", 00:12:48.512 "raid_level": "raid1", 00:12:48.512 "superblock": false, 00:12:48.512 "num_base_bdevs": 3, 00:12:48.512 "num_base_bdevs_discovered": 1, 00:12:48.512 "num_base_bdevs_operational": 3, 00:12:48.512 "base_bdevs_list": [ 00:12:48.512 { 00:12:48.512 "name": "BaseBdev1", 00:12:48.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.512 "is_configured": false, 00:12:48.512 "data_offset": 0, 00:12:48.512 "data_size": 0 00:12:48.512 }, 00:12:48.512 { 00:12:48.512 "name": null, 00:12:48.512 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:48.512 "is_configured": false, 00:12:48.512 "data_offset": 0, 00:12:48.512 "data_size": 65536 00:12:48.512 }, 00:12:48.512 { 00:12:48.512 "name": "BaseBdev3", 00:12:48.512 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:48.512 "is_configured": true, 00:12:48.512 "data_offset": 0, 00:12:48.512 "data_size": 65536 00:12:48.512 } 00:12:48.512 ] 00:12:48.512 }' 00:12:48.512 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:48.512 04:13:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.077 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.077 04:13:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:49.335 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:12:49.335 04:13:37 
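Annotation: the check repeated throughout this trace is verify_raid_bdev_state, which pulls the raid descriptor with bdev_raid_get_bdevs and asserts on its fields with jq. A condensed sketch of the remove-and-verify sequence just executed is given below, using only the RPC calls and jq filters that appear in the log; $SPDK_DIR and the socket path are the same assumptions as in the previous sketch.

# Sketch of the verify_raid_bdev_state pattern, assuming $SPDK_DIR and the
# /var/tmp/spdk-raid.sock RPC socket used by this job.
rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Detach one base bdev from the array.
$rpc bdev_raid_remove_base_bdev BaseBdev2

# Pull the raid descriptor and confirm it dropped back to "configuring" with
# only the remaining base bdev discovered, as in the JSON above.
info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
echo "$info" | jq -r '.state'                        # expect "configuring"
echo "$info" | jq -r '.num_base_bdevs_discovered'    # expect 1
echo "$info" | jq -r '.base_bdevs_list[1].is_configured'   # expect false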
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:49.593 [2024-05-15 04:13:37.401404] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:49.593 BaseBdev1 00:12:49.593 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:12:49.593 04:13:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:12:49.593 04:13:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:49.594 04:13:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:49.594 04:13:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:49.594 04:13:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:49.594 04:13:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:49.851 04:13:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:50.110 [ 00:12:50.110 { 00:12:50.110 "name": "BaseBdev1", 00:12:50.110 "aliases": [ 00:12:50.110 "45ff1a6a-3668-4746-9b02-c56152fd9129" 00:12:50.110 ], 00:12:50.110 "product_name": "Malloc disk", 00:12:50.110 "block_size": 512, 00:12:50.110 "num_blocks": 65536, 00:12:50.110 "uuid": "45ff1a6a-3668-4746-9b02-c56152fd9129", 00:12:50.110 "assigned_rate_limits": { 00:12:50.110 "rw_ios_per_sec": 0, 00:12:50.110 "rw_mbytes_per_sec": 0, 00:12:50.110 "r_mbytes_per_sec": 0, 00:12:50.110 "w_mbytes_per_sec": 0 00:12:50.110 }, 00:12:50.110 "claimed": true, 00:12:50.110 "claim_type": "exclusive_write", 00:12:50.110 "zoned": false, 00:12:50.110 "supported_io_types": { 00:12:50.110 "read": true, 00:12:50.110 "write": true, 00:12:50.110 "unmap": true, 00:12:50.110 "write_zeroes": true, 00:12:50.110 "flush": true, 00:12:50.110 "reset": true, 00:12:50.110 "compare": false, 00:12:50.110 "compare_and_write": false, 00:12:50.110 "abort": true, 00:12:50.110 "nvme_admin": false, 00:12:50.110 "nvme_io": false 00:12:50.110 }, 00:12:50.110 "memory_domains": [ 00:12:50.110 { 00:12:50.110 "dma_device_id": "system", 00:12:50.110 "dma_device_type": 1 00:12:50.110 }, 00:12:50.110 { 00:12:50.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.110 "dma_device_type": 2 00:12:50.110 } 00:12:50.110 ], 00:12:50.110 "driver_specific": {} 00:12:50.110 } 00:12:50.110 ] 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:50.110 04:13:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.110 04:13:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:50.368 04:13:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:50.368 "name": "Existed_Raid", 00:12:50.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:50.368 "strip_size_kb": 0, 00:12:50.368 "state": "configuring", 00:12:50.368 "raid_level": "raid1", 00:12:50.368 "superblock": false, 00:12:50.368 "num_base_bdevs": 3, 00:12:50.368 "num_base_bdevs_discovered": 2, 00:12:50.368 "num_base_bdevs_operational": 3, 00:12:50.368 "base_bdevs_list": [ 00:12:50.368 { 00:12:50.368 "name": "BaseBdev1", 00:12:50.368 "uuid": "45ff1a6a-3668-4746-9b02-c56152fd9129", 00:12:50.368 "is_configured": true, 00:12:50.368 "data_offset": 0, 00:12:50.368 "data_size": 65536 00:12:50.368 }, 00:12:50.368 { 00:12:50.368 "name": null, 00:12:50.368 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:50.368 "is_configured": false, 00:12:50.368 "data_offset": 0, 00:12:50.368 "data_size": 65536 00:12:50.368 }, 00:12:50.368 { 00:12:50.368 "name": "BaseBdev3", 00:12:50.368 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:50.368 "is_configured": true, 00:12:50.368 "data_offset": 0, 00:12:50.368 "data_size": 65536 00:12:50.368 } 00:12:50.368 ] 00:12:50.368 }' 00:12:50.368 04:13:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:50.368 04:13:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:50.934 04:13:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.934 04:13:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:51.192 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:12:51.192 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:51.450 [2024-05-15 04:13:39.234301] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:51.450 04:13:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.450 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:51.708 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:51.708 "name": "Existed_Raid", 00:12:51.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:51.708 "strip_size_kb": 0, 00:12:51.708 "state": "configuring", 00:12:51.708 "raid_level": "raid1", 00:12:51.708 "superblock": false, 00:12:51.708 "num_base_bdevs": 3, 00:12:51.708 "num_base_bdevs_discovered": 1, 00:12:51.708 "num_base_bdevs_operational": 3, 00:12:51.708 "base_bdevs_list": [ 00:12:51.708 { 00:12:51.708 "name": "BaseBdev1", 00:12:51.708 "uuid": "45ff1a6a-3668-4746-9b02-c56152fd9129", 00:12:51.708 "is_configured": true, 00:12:51.708 "data_offset": 0, 00:12:51.708 "data_size": 65536 00:12:51.708 }, 00:12:51.708 { 00:12:51.708 "name": null, 00:12:51.709 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:51.709 "is_configured": false, 00:12:51.709 "data_offset": 0, 00:12:51.709 "data_size": 65536 00:12:51.709 }, 00:12:51.709 { 00:12:51.709 "name": null, 00:12:51.709 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:51.709 "is_configured": false, 00:12:51.709 "data_offset": 0, 00:12:51.709 "data_size": 65536 00:12:51.709 } 00:12:51.709 ] 00:12:51.709 }' 00:12:51.709 04:13:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:51.709 04:13:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.275 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.275 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:52.533 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:12:52.533 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:52.791 [2024-05-15 04:13:40.601962] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 
00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:52.791 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:53.048 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:53.048 "name": "Existed_Raid", 00:12:53.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.048 "strip_size_kb": 0, 00:12:53.048 "state": "configuring", 00:12:53.048 "raid_level": "raid1", 00:12:53.048 "superblock": false, 00:12:53.048 "num_base_bdevs": 3, 00:12:53.048 "num_base_bdevs_discovered": 2, 00:12:53.048 "num_base_bdevs_operational": 3, 00:12:53.048 "base_bdevs_list": [ 00:12:53.048 { 00:12:53.048 "name": "BaseBdev1", 00:12:53.048 "uuid": "45ff1a6a-3668-4746-9b02-c56152fd9129", 00:12:53.048 "is_configured": true, 00:12:53.048 "data_offset": 0, 00:12:53.048 "data_size": 65536 00:12:53.048 }, 00:12:53.048 { 00:12:53.048 "name": null, 00:12:53.048 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:53.048 "is_configured": false, 00:12:53.048 "data_offset": 0, 00:12:53.048 "data_size": 65536 00:12:53.048 }, 00:12:53.048 { 00:12:53.048 "name": "BaseBdev3", 00:12:53.048 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:53.048 "is_configured": true, 00:12:53.048 "data_offset": 0, 00:12:53.048 "data_size": 65536 00:12:53.048 } 00:12:53.048 ] 00:12:53.048 }' 00:12:53.048 04:13:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:53.048 04:13:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:53.613 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.613 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:53.871 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:12:53.871 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:54.129 [2024-05-15 04:13:41.929488] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:54.129 04:13:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.129 04:13:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:54.387 04:13:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:54.387 "name": "Existed_Raid", 00:12:54.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:54.387 "strip_size_kb": 0, 00:12:54.387 "state": "configuring", 00:12:54.387 "raid_level": "raid1", 00:12:54.387 "superblock": false, 00:12:54.387 "num_base_bdevs": 3, 00:12:54.387 "num_base_bdevs_discovered": 1, 00:12:54.387 "num_base_bdevs_operational": 3, 00:12:54.387 "base_bdevs_list": [ 00:12:54.387 { 00:12:54.387 "name": null, 00:12:54.387 "uuid": "45ff1a6a-3668-4746-9b02-c56152fd9129", 00:12:54.387 "is_configured": false, 00:12:54.387 "data_offset": 0, 00:12:54.387 "data_size": 65536 00:12:54.387 }, 00:12:54.387 { 00:12:54.387 "name": null, 00:12:54.387 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:54.387 "is_configured": false, 00:12:54.387 "data_offset": 0, 00:12:54.387 "data_size": 65536 00:12:54.387 }, 00:12:54.387 { 00:12:54.387 "name": "BaseBdev3", 00:12:54.387 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:54.387 "is_configured": true, 00:12:54.387 "data_offset": 0, 00:12:54.387 "data_size": 65536 00:12:54.387 } 00:12:54.387 ] 00:12:54.387 }' 00:12:54.387 04:13:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:54.387 04:13:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.949 04:13:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.949 04:13:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:55.242 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:12:55.242 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:55.500 [2024-05-15 04:13:43.272513] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:12:55.500 
04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.500 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:55.756 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:55.756 "name": "Existed_Raid", 00:12:55.756 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:55.756 "strip_size_kb": 0, 00:12:55.756 "state": "configuring", 00:12:55.756 "raid_level": "raid1", 00:12:55.756 "superblock": false, 00:12:55.756 "num_base_bdevs": 3, 00:12:55.756 "num_base_bdevs_discovered": 2, 00:12:55.756 "num_base_bdevs_operational": 3, 00:12:55.756 "base_bdevs_list": [ 00:12:55.756 { 00:12:55.756 "name": null, 00:12:55.756 "uuid": "45ff1a6a-3668-4746-9b02-c56152fd9129", 00:12:55.756 "is_configured": false, 00:12:55.756 "data_offset": 0, 00:12:55.756 "data_size": 65536 00:12:55.756 }, 00:12:55.756 { 00:12:55.756 "name": "BaseBdev2", 00:12:55.756 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:55.756 "is_configured": true, 00:12:55.756 "data_offset": 0, 00:12:55.757 "data_size": 65536 00:12:55.757 }, 00:12:55.757 { 00:12:55.757 "name": "BaseBdev3", 00:12:55.757 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:55.757 "is_configured": true, 00:12:55.757 "data_offset": 0, 00:12:55.757 "data_size": 65536 00:12:55.757 } 00:12:55.757 ] 00:12:55.757 }' 00:12:55.757 04:13:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:55.757 04:13:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.335 04:13:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.335 04:13:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:56.593 04:13:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:12:56.593 04:13:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.593 04:13:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:56.593 04:13:44 
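Annotation: this part of the trace exercises the two recovery paths for a degraded array: a base bdev that still exists is re-attached with bdev_raid_add_base_bdev, while a deleted one is recreated under the UUID still recorded in its empty slot so the raid module can claim it (the NewBaseBdev step that follows). A sketch of both, under the same $SPDK_DIR and socket assumptions:

# Sketch only, same assumptions as above.
rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# 1) A base bdev that still exists can simply be re-attached to the array.
$rpc bdev_raid_add_base_bdev Existed_Raid BaseBdev2

# 2) A deleted base bdev must be recreated first; read the UUID kept in the
#    empty slot and create a new malloc bdev with it so the array can claim it.
uuid=$($rpc bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[0].uuid')
$rpc bdev_malloc_create 32 512 -b NewBaseBdev -u "$uuid"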
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 45ff1a6a-3668-4746-9b02-c56152fd9129 00:12:56.851 [2024-05-15 04:13:44.841919] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:56.851 [2024-05-15 04:13:44.841973] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1488b90 00:12:56.851 [2024-05-15 04:13:44.841983] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:56.851 [2024-05-15 04:13:44.842186] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12d4ff0 00:12:56.851 [2024-05-15 04:13:44.842318] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1488b90 00:12:56.851 [2024-05-15 04:13:44.842331] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1488b90 00:12:56.851 [2024-05-15 04:13:44.842541] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:56.851 NewBaseBdev 00:12:56.851 04:13:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:12:56.851 04:13:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:12:56.851 04:13:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:12:56.851 04:13:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:12:56.851 04:13:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:12:56.851 04:13:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:12:56.851 04:13:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:57.108 04:13:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:57.365 [ 00:12:57.365 { 00:12:57.365 "name": "NewBaseBdev", 00:12:57.365 "aliases": [ 00:12:57.365 "45ff1a6a-3668-4746-9b02-c56152fd9129" 00:12:57.365 ], 00:12:57.365 "product_name": "Malloc disk", 00:12:57.365 "block_size": 512, 00:12:57.365 "num_blocks": 65536, 00:12:57.365 "uuid": "45ff1a6a-3668-4746-9b02-c56152fd9129", 00:12:57.365 "assigned_rate_limits": { 00:12:57.365 "rw_ios_per_sec": 0, 00:12:57.365 "rw_mbytes_per_sec": 0, 00:12:57.365 "r_mbytes_per_sec": 0, 00:12:57.365 "w_mbytes_per_sec": 0 00:12:57.365 }, 00:12:57.365 "claimed": true, 00:12:57.365 "claim_type": "exclusive_write", 00:12:57.365 "zoned": false, 00:12:57.365 "supported_io_types": { 00:12:57.365 "read": true, 00:12:57.365 "write": true, 00:12:57.365 "unmap": true, 00:12:57.365 "write_zeroes": true, 00:12:57.365 "flush": true, 00:12:57.365 "reset": true, 00:12:57.365 "compare": false, 00:12:57.365 "compare_and_write": false, 00:12:57.365 "abort": true, 00:12:57.365 "nvme_admin": false, 00:12:57.365 "nvme_io": false 00:12:57.365 }, 00:12:57.365 "memory_domains": [ 00:12:57.365 { 00:12:57.365 "dma_device_id": "system", 00:12:57.365 "dma_device_type": 1 00:12:57.365 }, 00:12:57.365 { 00:12:57.365 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.365 "dma_device_type": 2 00:12:57.365 } 00:12:57.365 ], 
00:12:57.365 "driver_specific": {} 00:12:57.365 } 00:12:57.365 ] 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.365 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:57.623 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:12:57.623 "name": "Existed_Raid", 00:12:57.623 "uuid": "0281561f-1de4-4553-b572-e49d9e01038b", 00:12:57.623 "strip_size_kb": 0, 00:12:57.623 "state": "online", 00:12:57.623 "raid_level": "raid1", 00:12:57.623 "superblock": false, 00:12:57.623 "num_base_bdevs": 3, 00:12:57.623 "num_base_bdevs_discovered": 3, 00:12:57.623 "num_base_bdevs_operational": 3, 00:12:57.623 "base_bdevs_list": [ 00:12:57.623 { 00:12:57.623 "name": "NewBaseBdev", 00:12:57.623 "uuid": "45ff1a6a-3668-4746-9b02-c56152fd9129", 00:12:57.623 "is_configured": true, 00:12:57.623 "data_offset": 0, 00:12:57.623 "data_size": 65536 00:12:57.623 }, 00:12:57.623 { 00:12:57.623 "name": "BaseBdev2", 00:12:57.623 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:57.623 "is_configured": true, 00:12:57.623 "data_offset": 0, 00:12:57.623 "data_size": 65536 00:12:57.623 }, 00:12:57.623 { 00:12:57.623 "name": "BaseBdev3", 00:12:57.623 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:57.623 "is_configured": true, 00:12:57.623 "data_offset": 0, 00:12:57.623 "data_size": 65536 00:12:57.623 } 00:12:57.623 ] 00:12:57.623 }' 00:12:57.623 04:13:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:12:57.623 04:13:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.187 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:12:58.188 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:12:58.188 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:12:58.188 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:12:58.188 04:13:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:12:58.188 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:12:58.188 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:58.188 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:12:58.445 [2024-05-15 04:13:46.394288] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:58.445 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:12:58.445 "name": "Existed_Raid", 00:12:58.445 "aliases": [ 00:12:58.445 "0281561f-1de4-4553-b572-e49d9e01038b" 00:12:58.445 ], 00:12:58.445 "product_name": "Raid Volume", 00:12:58.445 "block_size": 512, 00:12:58.445 "num_blocks": 65536, 00:12:58.445 "uuid": "0281561f-1de4-4553-b572-e49d9e01038b", 00:12:58.445 "assigned_rate_limits": { 00:12:58.445 "rw_ios_per_sec": 0, 00:12:58.445 "rw_mbytes_per_sec": 0, 00:12:58.445 "r_mbytes_per_sec": 0, 00:12:58.445 "w_mbytes_per_sec": 0 00:12:58.445 }, 00:12:58.445 "claimed": false, 00:12:58.445 "zoned": false, 00:12:58.445 "supported_io_types": { 00:12:58.445 "read": true, 00:12:58.445 "write": true, 00:12:58.445 "unmap": false, 00:12:58.445 "write_zeroes": true, 00:12:58.445 "flush": false, 00:12:58.445 "reset": true, 00:12:58.445 "compare": false, 00:12:58.445 "compare_and_write": false, 00:12:58.445 "abort": false, 00:12:58.445 "nvme_admin": false, 00:12:58.445 "nvme_io": false 00:12:58.445 }, 00:12:58.445 "memory_domains": [ 00:12:58.445 { 00:12:58.445 "dma_device_id": "system", 00:12:58.445 "dma_device_type": 1 00:12:58.445 }, 00:12:58.445 { 00:12:58.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:58.445 "dma_device_type": 2 00:12:58.445 }, 00:12:58.445 { 00:12:58.445 "dma_device_id": "system", 00:12:58.445 "dma_device_type": 1 00:12:58.445 }, 00:12:58.445 { 00:12:58.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:58.445 "dma_device_type": 2 00:12:58.445 }, 00:12:58.445 { 00:12:58.445 "dma_device_id": "system", 00:12:58.445 "dma_device_type": 1 00:12:58.445 }, 00:12:58.445 { 00:12:58.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:58.445 "dma_device_type": 2 00:12:58.445 } 00:12:58.445 ], 00:12:58.445 "driver_specific": { 00:12:58.445 "raid": { 00:12:58.445 "uuid": "0281561f-1de4-4553-b572-e49d9e01038b", 00:12:58.445 "strip_size_kb": 0, 00:12:58.445 "state": "online", 00:12:58.445 "raid_level": "raid1", 00:12:58.445 "superblock": false, 00:12:58.445 "num_base_bdevs": 3, 00:12:58.445 "num_base_bdevs_discovered": 3, 00:12:58.445 "num_base_bdevs_operational": 3, 00:12:58.445 "base_bdevs_list": [ 00:12:58.445 { 00:12:58.445 "name": "NewBaseBdev", 00:12:58.445 "uuid": "45ff1a6a-3668-4746-9b02-c56152fd9129", 00:12:58.445 "is_configured": true, 00:12:58.445 "data_offset": 0, 00:12:58.445 "data_size": 65536 00:12:58.445 }, 00:12:58.445 { 00:12:58.445 "name": "BaseBdev2", 00:12:58.445 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:58.445 "is_configured": true, 00:12:58.445 "data_offset": 0, 00:12:58.445 "data_size": 65536 00:12:58.445 }, 00:12:58.445 { 00:12:58.445 "name": "BaseBdev3", 00:12:58.445 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:58.445 "is_configured": true, 00:12:58.445 "data_offset": 0, 00:12:58.445 "data_size": 65536 00:12:58.445 } 00:12:58.445 ] 00:12:58.445 } 00:12:58.445 } 00:12:58.445 }' 00:12:58.445 
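Annotation: the verify_raid_bdev_properties helper entered here inspects the assembled array through the ordinary bdev_get_bdevs RPC (the array shows up as a single "Raid Volume" bdev) and then re-checks the block-layout fields of every configured base bdev. A sketch of that walk, using the jq filter shown just below and the same $SPDK_DIR/socket assumptions:

# Sketch of the property walk done by verify_raid_bdev_properties.
rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# The assembled array is reported as a single "Raid Volume" bdev.
raid_info=$($rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')

# List the base bdevs currently configured into it ...
names=$(echo "$raid_info" | \
        jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

# ... and check the same layout fields the test asserts on each of them
# (block_size 512, the metadata/DIF fields reported as null for malloc bdevs).
for name in $names; do
    $rpc bdev_get_bdevs -b "$name" | jq '.[0] | {block_size, md_size, md_interleave, dif_type}'
done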
04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:58.445 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:12:58.445 BaseBdev2 00:12:58.445 BaseBdev3' 00:12:58.445 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:58.445 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:58.445 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:58.703 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:58.703 "name": "NewBaseBdev", 00:12:58.703 "aliases": [ 00:12:58.703 "45ff1a6a-3668-4746-9b02-c56152fd9129" 00:12:58.703 ], 00:12:58.703 "product_name": "Malloc disk", 00:12:58.703 "block_size": 512, 00:12:58.703 "num_blocks": 65536, 00:12:58.703 "uuid": "45ff1a6a-3668-4746-9b02-c56152fd9129", 00:12:58.703 "assigned_rate_limits": { 00:12:58.703 "rw_ios_per_sec": 0, 00:12:58.703 "rw_mbytes_per_sec": 0, 00:12:58.703 "r_mbytes_per_sec": 0, 00:12:58.703 "w_mbytes_per_sec": 0 00:12:58.703 }, 00:12:58.703 "claimed": true, 00:12:58.703 "claim_type": "exclusive_write", 00:12:58.703 "zoned": false, 00:12:58.703 "supported_io_types": { 00:12:58.703 "read": true, 00:12:58.703 "write": true, 00:12:58.703 "unmap": true, 00:12:58.703 "write_zeroes": true, 00:12:58.703 "flush": true, 00:12:58.703 "reset": true, 00:12:58.703 "compare": false, 00:12:58.703 "compare_and_write": false, 00:12:58.703 "abort": true, 00:12:58.703 "nvme_admin": false, 00:12:58.703 "nvme_io": false 00:12:58.703 }, 00:12:58.703 "memory_domains": [ 00:12:58.703 { 00:12:58.703 "dma_device_id": "system", 00:12:58.703 "dma_device_type": 1 00:12:58.703 }, 00:12:58.703 { 00:12:58.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:58.703 "dma_device_type": 2 00:12:58.703 } 00:12:58.703 ], 00:12:58.703 "driver_specific": {} 00:12:58.703 }' 00:12:58.703 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:58.960 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:58.960 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:58.960 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:58.960 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:58.960 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:58.960 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:58.960 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:58.960 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:58.960 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:58.960 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:59.218 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:59.218 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:59.218 04:13:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:59.218 04:13:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:59.218 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:59.218 "name": "BaseBdev2", 00:12:59.218 "aliases": [ 00:12:59.218 "e152fc40-6122-4849-a840-30fefbbfdefb" 00:12:59.218 ], 00:12:59.218 "product_name": "Malloc disk", 00:12:59.218 "block_size": 512, 00:12:59.218 "num_blocks": 65536, 00:12:59.218 "uuid": "e152fc40-6122-4849-a840-30fefbbfdefb", 00:12:59.218 "assigned_rate_limits": { 00:12:59.218 "rw_ios_per_sec": 0, 00:12:59.218 "rw_mbytes_per_sec": 0, 00:12:59.218 "r_mbytes_per_sec": 0, 00:12:59.218 "w_mbytes_per_sec": 0 00:12:59.218 }, 00:12:59.218 "claimed": true, 00:12:59.218 "claim_type": "exclusive_write", 00:12:59.218 "zoned": false, 00:12:59.218 "supported_io_types": { 00:12:59.218 "read": true, 00:12:59.218 "write": true, 00:12:59.218 "unmap": true, 00:12:59.218 "write_zeroes": true, 00:12:59.218 "flush": true, 00:12:59.218 "reset": true, 00:12:59.218 "compare": false, 00:12:59.218 "compare_and_write": false, 00:12:59.218 "abort": true, 00:12:59.218 "nvme_admin": false, 00:12:59.218 "nvme_io": false 00:12:59.218 }, 00:12:59.218 "memory_domains": [ 00:12:59.218 { 00:12:59.218 "dma_device_id": "system", 00:12:59.218 "dma_device_type": 1 00:12:59.218 }, 00:12:59.218 { 00:12:59.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.218 "dma_device_type": 2 00:12:59.218 } 00:12:59.218 ], 00:12:59.218 "driver_specific": {} 00:12:59.218 }' 00:12:59.218 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:59.475 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:59.475 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:59.475 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:59.475 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:59.475 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:59.475 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:59.475 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:59.475 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:59.475 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:59.732 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:12:59.732 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:12:59.732 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:12:59.732 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:59.732 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:12:59.990 "name": "BaseBdev3", 00:12:59.990 "aliases": [ 00:12:59.990 
"6c4a941a-ff3b-4467-bab7-7de860a7bc03" 00:12:59.990 ], 00:12:59.990 "product_name": "Malloc disk", 00:12:59.990 "block_size": 512, 00:12:59.990 "num_blocks": 65536, 00:12:59.990 "uuid": "6c4a941a-ff3b-4467-bab7-7de860a7bc03", 00:12:59.990 "assigned_rate_limits": { 00:12:59.990 "rw_ios_per_sec": 0, 00:12:59.990 "rw_mbytes_per_sec": 0, 00:12:59.990 "r_mbytes_per_sec": 0, 00:12:59.990 "w_mbytes_per_sec": 0 00:12:59.990 }, 00:12:59.990 "claimed": true, 00:12:59.990 "claim_type": "exclusive_write", 00:12:59.990 "zoned": false, 00:12:59.990 "supported_io_types": { 00:12:59.990 "read": true, 00:12:59.990 "write": true, 00:12:59.990 "unmap": true, 00:12:59.990 "write_zeroes": true, 00:12:59.990 "flush": true, 00:12:59.990 "reset": true, 00:12:59.990 "compare": false, 00:12:59.990 "compare_and_write": false, 00:12:59.990 "abort": true, 00:12:59.990 "nvme_admin": false, 00:12:59.990 "nvme_io": false 00:12:59.990 }, 00:12:59.990 "memory_domains": [ 00:12:59.990 { 00:12:59.990 "dma_device_id": "system", 00:12:59.990 "dma_device_type": 1 00:12:59.990 }, 00:12:59.990 { 00:12:59.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:59.990 "dma_device_type": 2 00:12:59.990 } 00:12:59.990 ], 00:12:59.990 "driver_specific": {} 00:12:59.990 }' 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:59.990 04:13:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:00.249 04:13:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:00.249 04:13:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:00.249 04:13:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:00.507 [2024-05-15 04:13:48.279040] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:00.507 [2024-05-15 04:13:48.279071] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:00.507 [2024-05-15 04:13:48.279149] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:00.507 [2024-05-15 04:13:48.279418] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:00.507 [2024-05-15 04:13:48.279435] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1488b90 name Existed_Raid, state offline 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 3858413 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 
-- # '[' -z 3858413 ']' 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 3858413 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3858413 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3858413' 00:13:00.507 killing process with pid 3858413 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 3858413 00:13:00.507 [2024-05-15 04:13:48.331150] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:00.507 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 3858413 00:13:00.507 [2024-05-15 04:13:48.367253] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:13:00.767 00:13:00.767 real 0m28.146s 00:13:00.767 user 0m52.771s 00:13:00.767 sys 0m3.936s 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.767 ************************************ 00:13:00.767 END TEST raid_state_function_test 00:13:00.767 ************************************ 00:13:00.767 04:13:48 bdev_raid -- bdev/bdev_raid.sh@804 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:13:00.767 04:13:48 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:13:00.767 04:13:48 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:13:00.767 04:13:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:00.767 ************************************ 00:13:00.767 START TEST raid_state_function_test_sb 00:13:00.767 ************************************ 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 3 true 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=3 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:00.767 
04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=3862350 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3862350' 00:13:00.767 Process raid pid: 3862350 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 3862350 /var/tmp/spdk-raid.sock 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3862350 ']' 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:00.767 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:13:00.767 04:13:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:00.767 [2024-05-15 04:13:48.764359] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
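The raid_state_function_test_sb run above boots a bare bdev_svc app on a private RPC socket and then drives it entirely through rpc.py. A minimal sketch of that sequence, assuming an SPDK checkout at $SPDK_DIR (placeholder path) and reusing only the commands that appear verbatim in this log:

  SPDK_DIR=/path/to/spdk            # assumption: local SPDK checkout
  SOCK=/var/tmp/spdk-raid.sock      # same socket path the test uses
  # Start a bare bdev service with raid debug logging, as the test does.
  "$SPDK_DIR/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
  rpc() { "$SPDK_DIR/scripts/rpc.py" -s "$SOCK" "$@"; }
  # Creating the raid1 bdev with a superblock (-s) before its base bdevs exist
  # leaves it in the "configuring" state, which is what the test asserts first.
  rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
  rpc bdev_malloc_create 32 512 -b BaseBdev1     # 32 MB malloc disk, 512-byte blocks
  rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'

The verify_raid_bdev_state checks traced below are jq filters of exactly this shape applied to that same bdev_raid_get_bdevs output.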
00:13:00.767 [2024-05-15 04:13:48.764429] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:01.025 [2024-05-15 04:13:48.848457] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:01.025 [2024-05-15 04:13:48.973249] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:01.284 [2024-05-15 04:13:49.051628] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:01.284 [2024-05-15 04:13:49.051671] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:01.850 04:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:13:01.850 04:13:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:13:01.850 04:13:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:02.108 [2024-05-15 04:13:50.016568] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:02.108 [2024-05-15 04:13:50.016621] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:02.108 [2024-05-15 04:13:50.016635] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:02.108 [2024-05-15 04:13:50.016649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:02.108 [2024-05-15 04:13:50.016658] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:02.108 [2024-05-15 04:13:50.016672] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.108 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:02.366 04:13:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:02.366 "name": "Existed_Raid", 00:13:02.366 "uuid": "98a2774d-8792-41e6-a1c8-f6db76fd420a", 00:13:02.366 "strip_size_kb": 0, 00:13:02.366 "state": "configuring", 00:13:02.366 "raid_level": "raid1", 00:13:02.366 "superblock": true, 00:13:02.366 "num_base_bdevs": 3, 00:13:02.366 "num_base_bdevs_discovered": 0, 00:13:02.366 "num_base_bdevs_operational": 3, 00:13:02.366 "base_bdevs_list": [ 00:13:02.366 { 00:13:02.366 "name": "BaseBdev1", 00:13:02.366 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:02.366 "is_configured": false, 00:13:02.366 "data_offset": 0, 00:13:02.366 "data_size": 0 00:13:02.366 }, 00:13:02.366 { 00:13:02.366 "name": "BaseBdev2", 00:13:02.366 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:02.366 "is_configured": false, 00:13:02.366 "data_offset": 0, 00:13:02.366 "data_size": 0 00:13:02.366 }, 00:13:02.366 { 00:13:02.366 "name": "BaseBdev3", 00:13:02.366 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:02.366 "is_configured": false, 00:13:02.366 "data_offset": 0, 00:13:02.366 "data_size": 0 00:13:02.366 } 00:13:02.366 ] 00:13:02.366 }' 00:13:02.366 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:02.366 04:13:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:02.931 04:13:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:03.188 [2024-05-15 04:13:51.115385] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:03.188 [2024-05-15 04:13:51.115424] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf32020 name Existed_Raid, state configuring 00:13:03.188 04:13:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:03.446 [2024-05-15 04:13:51.376082] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:03.446 [2024-05-15 04:13:51.376133] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:03.446 [2024-05-15 04:13:51.376145] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:03.446 [2024-05-15 04:13:51.376158] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:03.446 [2024-05-15 04:13:51.376177] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:03.446 [2024-05-15 04:13:51.376189] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:03.446 04:13:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:03.704 [2024-05-15 04:13:51.633462] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:03.704 BaseBdev1 00:13:03.704 04:13:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:13:03.704 04:13:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:03.704 04:13:51 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:03.704 04:13:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:03.704 04:13:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:03.704 04:13:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:03.704 04:13:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:03.962 04:13:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:04.219 [ 00:13:04.219 { 00:13:04.219 "name": "BaseBdev1", 00:13:04.219 "aliases": [ 00:13:04.219 "b831e2ae-227d-4f9c-82ca-c49a433ecaa9" 00:13:04.219 ], 00:13:04.219 "product_name": "Malloc disk", 00:13:04.219 "block_size": 512, 00:13:04.219 "num_blocks": 65536, 00:13:04.219 "uuid": "b831e2ae-227d-4f9c-82ca-c49a433ecaa9", 00:13:04.219 "assigned_rate_limits": { 00:13:04.219 "rw_ios_per_sec": 0, 00:13:04.219 "rw_mbytes_per_sec": 0, 00:13:04.219 "r_mbytes_per_sec": 0, 00:13:04.219 "w_mbytes_per_sec": 0 00:13:04.219 }, 00:13:04.219 "claimed": true, 00:13:04.219 "claim_type": "exclusive_write", 00:13:04.219 "zoned": false, 00:13:04.219 "supported_io_types": { 00:13:04.219 "read": true, 00:13:04.219 "write": true, 00:13:04.219 "unmap": true, 00:13:04.219 "write_zeroes": true, 00:13:04.219 "flush": true, 00:13:04.219 "reset": true, 00:13:04.219 "compare": false, 00:13:04.219 "compare_and_write": false, 00:13:04.219 "abort": true, 00:13:04.219 "nvme_admin": false, 00:13:04.219 "nvme_io": false 00:13:04.219 }, 00:13:04.219 "memory_domains": [ 00:13:04.219 { 00:13:04.219 "dma_device_id": "system", 00:13:04.219 "dma_device_type": 1 00:13:04.219 }, 00:13:04.219 { 00:13:04.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.219 "dma_device_type": 2 00:13:04.219 } 00:13:04.219 ], 00:13:04.219 "driver_specific": {} 00:13:04.219 } 00:13:04.219 ] 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.219 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:04.477 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:04.477 "name": "Existed_Raid", 00:13:04.477 "uuid": "218c86b3-31fa-447a-a5e2-48276555450a", 00:13:04.477 "strip_size_kb": 0, 00:13:04.477 "state": "configuring", 00:13:04.477 "raid_level": "raid1", 00:13:04.477 "superblock": true, 00:13:04.477 "num_base_bdevs": 3, 00:13:04.477 "num_base_bdevs_discovered": 1, 00:13:04.477 "num_base_bdevs_operational": 3, 00:13:04.477 "base_bdevs_list": [ 00:13:04.477 { 00:13:04.477 "name": "BaseBdev1", 00:13:04.477 "uuid": "b831e2ae-227d-4f9c-82ca-c49a433ecaa9", 00:13:04.477 "is_configured": true, 00:13:04.477 "data_offset": 2048, 00:13:04.477 "data_size": 63488 00:13:04.477 }, 00:13:04.477 { 00:13:04.477 "name": "BaseBdev2", 00:13:04.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:04.477 "is_configured": false, 00:13:04.477 "data_offset": 0, 00:13:04.477 "data_size": 0 00:13:04.477 }, 00:13:04.477 { 00:13:04.477 "name": "BaseBdev3", 00:13:04.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:04.477 "is_configured": false, 00:13:04.478 "data_offset": 0, 00:13:04.478 "data_size": 0 00:13:04.478 } 00:13:04.478 ] 00:13:04.478 }' 00:13:04.478 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:04.478 04:13:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:05.042 04:13:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:05.301 [2024-05-15 04:13:53.137380] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:05.301 [2024-05-15 04:13:53.137437] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf318f0 name Existed_Raid, state configuring 00:13:05.301 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:05.579 [2024-05-15 04:13:53.374082] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:05.580 [2024-05-15 04:13:53.375489] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:05.580 [2024-05-15 04:13:53.375519] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:05.580 [2024-05-15 04:13:53.375530] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:05.580 [2024-05-15 04:13:53.375540] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
raid_bdev_name=Existed_Raid 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.580 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:05.838 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:05.838 "name": "Existed_Raid", 00:13:05.838 "uuid": "18dcc83d-059e-4cca-8b18-88454c14398c", 00:13:05.838 "strip_size_kb": 0, 00:13:05.838 "state": "configuring", 00:13:05.838 "raid_level": "raid1", 00:13:05.838 "superblock": true, 00:13:05.838 "num_base_bdevs": 3, 00:13:05.838 "num_base_bdevs_discovered": 1, 00:13:05.838 "num_base_bdevs_operational": 3, 00:13:05.838 "base_bdevs_list": [ 00:13:05.838 { 00:13:05.838 "name": "BaseBdev1", 00:13:05.838 "uuid": "b831e2ae-227d-4f9c-82ca-c49a433ecaa9", 00:13:05.838 "is_configured": true, 00:13:05.838 "data_offset": 2048, 00:13:05.838 "data_size": 63488 00:13:05.838 }, 00:13:05.838 { 00:13:05.838 "name": "BaseBdev2", 00:13:05.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:05.838 "is_configured": false, 00:13:05.838 "data_offset": 0, 00:13:05.838 "data_size": 0 00:13:05.838 }, 00:13:05.838 { 00:13:05.838 "name": "BaseBdev3", 00:13:05.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:05.838 "is_configured": false, 00:13:05.838 "data_offset": 0, 00:13:05.838 "data_size": 0 00:13:05.838 } 00:13:05.838 ] 00:13:05.838 }' 00:13:05.838 04:13:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:05.838 04:13:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:06.404 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:06.404 [2024-05-15 04:13:54.401542] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:06.404 BaseBdev2 00:13:06.404 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:13:06.404 04:13:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:13:06.404 04:13:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:06.404 04:13:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:06.404 04:13:54 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:06.404 04:13:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:06.404 04:13:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:06.662 04:13:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:06.920 [ 00:13:06.920 { 00:13:06.920 "name": "BaseBdev2", 00:13:06.920 "aliases": [ 00:13:06.920 "78d2dd61-34d2-4e1e-9298-742f7be19584" 00:13:06.920 ], 00:13:06.920 "product_name": "Malloc disk", 00:13:06.920 "block_size": 512, 00:13:06.920 "num_blocks": 65536, 00:13:06.920 "uuid": "78d2dd61-34d2-4e1e-9298-742f7be19584", 00:13:06.920 "assigned_rate_limits": { 00:13:06.920 "rw_ios_per_sec": 0, 00:13:06.920 "rw_mbytes_per_sec": 0, 00:13:06.920 "r_mbytes_per_sec": 0, 00:13:06.920 "w_mbytes_per_sec": 0 00:13:06.920 }, 00:13:06.920 "claimed": true, 00:13:06.920 "claim_type": "exclusive_write", 00:13:06.920 "zoned": false, 00:13:06.920 "supported_io_types": { 00:13:06.920 "read": true, 00:13:06.920 "write": true, 00:13:06.920 "unmap": true, 00:13:06.920 "write_zeroes": true, 00:13:06.920 "flush": true, 00:13:06.920 "reset": true, 00:13:06.920 "compare": false, 00:13:06.920 "compare_and_write": false, 00:13:06.920 "abort": true, 00:13:06.920 "nvme_admin": false, 00:13:06.920 "nvme_io": false 00:13:06.920 }, 00:13:06.920 "memory_domains": [ 00:13:06.920 { 00:13:06.920 "dma_device_id": "system", 00:13:06.920 "dma_device_type": 1 00:13:06.920 }, 00:13:06.920 { 00:13:06.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.920 "dma_device_type": 2 00:13:06.920 } 00:13:06.920 ], 00:13:06.920 "driver_specific": {} 00:13:06.920 } 00:13:06.920 ] 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:06.920 04:13:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.920 04:13:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:07.179 04:13:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:07.179 "name": "Existed_Raid", 00:13:07.179 "uuid": "18dcc83d-059e-4cca-8b18-88454c14398c", 00:13:07.179 "strip_size_kb": 0, 00:13:07.179 "state": "configuring", 00:13:07.179 "raid_level": "raid1", 00:13:07.179 "superblock": true, 00:13:07.179 "num_base_bdevs": 3, 00:13:07.179 "num_base_bdevs_discovered": 2, 00:13:07.179 "num_base_bdevs_operational": 3, 00:13:07.179 "base_bdevs_list": [ 00:13:07.179 { 00:13:07.179 "name": "BaseBdev1", 00:13:07.179 "uuid": "b831e2ae-227d-4f9c-82ca-c49a433ecaa9", 00:13:07.179 "is_configured": true, 00:13:07.179 "data_offset": 2048, 00:13:07.179 "data_size": 63488 00:13:07.179 }, 00:13:07.179 { 00:13:07.179 "name": "BaseBdev2", 00:13:07.179 "uuid": "78d2dd61-34d2-4e1e-9298-742f7be19584", 00:13:07.179 "is_configured": true, 00:13:07.179 "data_offset": 2048, 00:13:07.179 "data_size": 63488 00:13:07.179 }, 00:13:07.179 { 00:13:07.179 "name": "BaseBdev3", 00:13:07.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:07.179 "is_configured": false, 00:13:07.179 "data_offset": 0, 00:13:07.179 "data_size": 0 00:13:07.179 } 00:13:07.179 ] 00:13:07.179 }' 00:13:07.179 04:13:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:07.179 04:13:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:07.745 04:13:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:08.003 [2024-05-15 04:13:55.927554] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:08.003 [2024-05-15 04:13:55.927791] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xf327e0 00:13:08.003 [2024-05-15 04:13:55.927809] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:08.003 [2024-05-15 04:13:55.928003] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf496d0 00:13:08.003 [2024-05-15 04:13:55.928168] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf327e0 00:13:08.003 [2024-05-15 04:13:55.928184] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xf327e0 00:13:08.003 [2024-05-15 04:13:55.928304] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:08.003 BaseBdev3 00:13:08.003 04:13:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:13:08.003 04:13:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:13:08.003 04:13:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:08.003 04:13:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:08.003 04:13:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:08.003 04:13:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 
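Once the third malloc base bdev is claimed here, the raid bdev transitions from configuring to online, and the verification traced below reduces to jq queries against the same two RPCs. A short sketch, assuming the rpc helper from the earlier sketch:

  # State check: Existed_Raid should now report "online" with 3 of 3 base bdevs discovered.
  rpc bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid") | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs)"'
  # Per-bdev property probes mirror the log's checks (.block_size, .md_size, .md_interleave, .dif_type).
  rpc bdev_get_bdevs -b BaseBdev3 -t 2000 | jq '.[0].block_size'    # expected: 512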
00:13:08.003 04:13:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:08.261 04:13:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:08.518 [ 00:13:08.518 { 00:13:08.518 "name": "BaseBdev3", 00:13:08.518 "aliases": [ 00:13:08.518 "c8e5d26e-9de3-4c7d-b84f-c3edb8f70193" 00:13:08.518 ], 00:13:08.518 "product_name": "Malloc disk", 00:13:08.518 "block_size": 512, 00:13:08.518 "num_blocks": 65536, 00:13:08.518 "uuid": "c8e5d26e-9de3-4c7d-b84f-c3edb8f70193", 00:13:08.518 "assigned_rate_limits": { 00:13:08.518 "rw_ios_per_sec": 0, 00:13:08.518 "rw_mbytes_per_sec": 0, 00:13:08.518 "r_mbytes_per_sec": 0, 00:13:08.518 "w_mbytes_per_sec": 0 00:13:08.518 }, 00:13:08.518 "claimed": true, 00:13:08.518 "claim_type": "exclusive_write", 00:13:08.518 "zoned": false, 00:13:08.518 "supported_io_types": { 00:13:08.518 "read": true, 00:13:08.518 "write": true, 00:13:08.518 "unmap": true, 00:13:08.518 "write_zeroes": true, 00:13:08.518 "flush": true, 00:13:08.518 "reset": true, 00:13:08.518 "compare": false, 00:13:08.518 "compare_and_write": false, 00:13:08.518 "abort": true, 00:13:08.518 "nvme_admin": false, 00:13:08.518 "nvme_io": false 00:13:08.518 }, 00:13:08.518 "memory_domains": [ 00:13:08.518 { 00:13:08.518 "dma_device_id": "system", 00:13:08.518 "dma_device_type": 1 00:13:08.518 }, 00:13:08.518 { 00:13:08.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:08.518 "dma_device_type": 2 00:13:08.518 } 00:13:08.518 ], 00:13:08.519 "driver_specific": {} 00:13:08.519 } 00:13:08.519 ] 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:08.519 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.519 04:13:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:08.777 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:08.777 "name": "Existed_Raid", 00:13:08.777 "uuid": "18dcc83d-059e-4cca-8b18-88454c14398c", 00:13:08.777 "strip_size_kb": 0, 00:13:08.777 "state": "online", 00:13:08.777 "raid_level": "raid1", 00:13:08.777 "superblock": true, 00:13:08.777 "num_base_bdevs": 3, 00:13:08.777 "num_base_bdevs_discovered": 3, 00:13:08.777 "num_base_bdevs_operational": 3, 00:13:08.777 "base_bdevs_list": [ 00:13:08.777 { 00:13:08.777 "name": "BaseBdev1", 00:13:08.777 "uuid": "b831e2ae-227d-4f9c-82ca-c49a433ecaa9", 00:13:08.777 "is_configured": true, 00:13:08.777 "data_offset": 2048, 00:13:08.777 "data_size": 63488 00:13:08.777 }, 00:13:08.777 { 00:13:08.777 "name": "BaseBdev2", 00:13:08.777 "uuid": "78d2dd61-34d2-4e1e-9298-742f7be19584", 00:13:08.777 "is_configured": true, 00:13:08.777 "data_offset": 2048, 00:13:08.777 "data_size": 63488 00:13:08.777 }, 00:13:08.777 { 00:13:08.777 "name": "BaseBdev3", 00:13:08.777 "uuid": "c8e5d26e-9de3-4c7d-b84f-c3edb8f70193", 00:13:08.777 "is_configured": true, 00:13:08.777 "data_offset": 2048, 00:13:08.777 "data_size": 63488 00:13:08.777 } 00:13:08.777 ] 00:13:08.777 }' 00:13:08.777 04:13:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:08.777 04:13:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:09.342 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:13:09.342 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:13:09.342 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:09.342 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:09.342 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:09.342 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:13:09.342 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:09.342 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:09.600 [2024-05-15 04:13:57.463874] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:09.600 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:09.600 "name": "Existed_Raid", 00:13:09.600 "aliases": [ 00:13:09.600 "18dcc83d-059e-4cca-8b18-88454c14398c" 00:13:09.600 ], 00:13:09.600 "product_name": "Raid Volume", 00:13:09.600 "block_size": 512, 00:13:09.600 "num_blocks": 63488, 00:13:09.600 "uuid": "18dcc83d-059e-4cca-8b18-88454c14398c", 00:13:09.600 "assigned_rate_limits": { 00:13:09.600 "rw_ios_per_sec": 0, 00:13:09.600 "rw_mbytes_per_sec": 0, 00:13:09.600 "r_mbytes_per_sec": 0, 00:13:09.600 "w_mbytes_per_sec": 0 00:13:09.600 }, 00:13:09.600 "claimed": false, 00:13:09.600 "zoned": false, 00:13:09.600 "supported_io_types": { 00:13:09.600 "read": true, 00:13:09.600 "write": true, 00:13:09.600 "unmap": false, 00:13:09.600 "write_zeroes": true, 00:13:09.600 "flush": false, 00:13:09.600 "reset": true, 00:13:09.600 
"compare": false, 00:13:09.600 "compare_and_write": false, 00:13:09.600 "abort": false, 00:13:09.600 "nvme_admin": false, 00:13:09.600 "nvme_io": false 00:13:09.600 }, 00:13:09.600 "memory_domains": [ 00:13:09.600 { 00:13:09.600 "dma_device_id": "system", 00:13:09.600 "dma_device_type": 1 00:13:09.600 }, 00:13:09.600 { 00:13:09.600 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.600 "dma_device_type": 2 00:13:09.600 }, 00:13:09.600 { 00:13:09.600 "dma_device_id": "system", 00:13:09.600 "dma_device_type": 1 00:13:09.600 }, 00:13:09.600 { 00:13:09.600 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.600 "dma_device_type": 2 00:13:09.600 }, 00:13:09.600 { 00:13:09.601 "dma_device_id": "system", 00:13:09.601 "dma_device_type": 1 00:13:09.601 }, 00:13:09.601 { 00:13:09.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.601 "dma_device_type": 2 00:13:09.601 } 00:13:09.601 ], 00:13:09.601 "driver_specific": { 00:13:09.601 "raid": { 00:13:09.601 "uuid": "18dcc83d-059e-4cca-8b18-88454c14398c", 00:13:09.601 "strip_size_kb": 0, 00:13:09.601 "state": "online", 00:13:09.601 "raid_level": "raid1", 00:13:09.601 "superblock": true, 00:13:09.601 "num_base_bdevs": 3, 00:13:09.601 "num_base_bdevs_discovered": 3, 00:13:09.601 "num_base_bdevs_operational": 3, 00:13:09.601 "base_bdevs_list": [ 00:13:09.601 { 00:13:09.601 "name": "BaseBdev1", 00:13:09.601 "uuid": "b831e2ae-227d-4f9c-82ca-c49a433ecaa9", 00:13:09.601 "is_configured": true, 00:13:09.601 "data_offset": 2048, 00:13:09.601 "data_size": 63488 00:13:09.601 }, 00:13:09.601 { 00:13:09.601 "name": "BaseBdev2", 00:13:09.601 "uuid": "78d2dd61-34d2-4e1e-9298-742f7be19584", 00:13:09.601 "is_configured": true, 00:13:09.601 "data_offset": 2048, 00:13:09.601 "data_size": 63488 00:13:09.601 }, 00:13:09.601 { 00:13:09.601 "name": "BaseBdev3", 00:13:09.601 "uuid": "c8e5d26e-9de3-4c7d-b84f-c3edb8f70193", 00:13:09.601 "is_configured": true, 00:13:09.601 "data_offset": 2048, 00:13:09.601 "data_size": 63488 00:13:09.601 } 00:13:09.601 ] 00:13:09.601 } 00:13:09.601 } 00:13:09.601 }' 00:13:09.601 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:09.601 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:13:09.601 BaseBdev2 00:13:09.601 BaseBdev3' 00:13:09.601 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:09.601 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:09.601 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:09.859 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:09.859 "name": "BaseBdev1", 00:13:09.859 "aliases": [ 00:13:09.859 "b831e2ae-227d-4f9c-82ca-c49a433ecaa9" 00:13:09.859 ], 00:13:09.859 "product_name": "Malloc disk", 00:13:09.859 "block_size": 512, 00:13:09.859 "num_blocks": 65536, 00:13:09.859 "uuid": "b831e2ae-227d-4f9c-82ca-c49a433ecaa9", 00:13:09.859 "assigned_rate_limits": { 00:13:09.859 "rw_ios_per_sec": 0, 00:13:09.859 "rw_mbytes_per_sec": 0, 00:13:09.859 "r_mbytes_per_sec": 0, 00:13:09.859 "w_mbytes_per_sec": 0 00:13:09.859 }, 00:13:09.859 "claimed": true, 00:13:09.859 "claim_type": "exclusive_write", 00:13:09.859 "zoned": false, 00:13:09.859 
"supported_io_types": { 00:13:09.859 "read": true, 00:13:09.859 "write": true, 00:13:09.859 "unmap": true, 00:13:09.859 "write_zeroes": true, 00:13:09.859 "flush": true, 00:13:09.859 "reset": true, 00:13:09.859 "compare": false, 00:13:09.859 "compare_and_write": false, 00:13:09.859 "abort": true, 00:13:09.859 "nvme_admin": false, 00:13:09.859 "nvme_io": false 00:13:09.859 }, 00:13:09.859 "memory_domains": [ 00:13:09.859 { 00:13:09.859 "dma_device_id": "system", 00:13:09.859 "dma_device_type": 1 00:13:09.859 }, 00:13:09.859 { 00:13:09.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:09.859 "dma_device_type": 2 00:13:09.859 } 00:13:09.859 ], 00:13:09.859 "driver_specific": {} 00:13:09.859 }' 00:13:09.859 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:09.859 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:09.859 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:09.859 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:10.118 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:10.118 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:10.118 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:10.118 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:10.118 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:10.118 04:13:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:10.118 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:10.118 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:10.118 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:10.118 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:10.118 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:10.377 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:10.377 "name": "BaseBdev2", 00:13:10.377 "aliases": [ 00:13:10.377 "78d2dd61-34d2-4e1e-9298-742f7be19584" 00:13:10.377 ], 00:13:10.377 "product_name": "Malloc disk", 00:13:10.377 "block_size": 512, 00:13:10.377 "num_blocks": 65536, 00:13:10.377 "uuid": "78d2dd61-34d2-4e1e-9298-742f7be19584", 00:13:10.377 "assigned_rate_limits": { 00:13:10.377 "rw_ios_per_sec": 0, 00:13:10.377 "rw_mbytes_per_sec": 0, 00:13:10.377 "r_mbytes_per_sec": 0, 00:13:10.377 "w_mbytes_per_sec": 0 00:13:10.377 }, 00:13:10.377 "claimed": true, 00:13:10.377 "claim_type": "exclusive_write", 00:13:10.377 "zoned": false, 00:13:10.377 "supported_io_types": { 00:13:10.377 "read": true, 00:13:10.377 "write": true, 00:13:10.377 "unmap": true, 00:13:10.377 "write_zeroes": true, 00:13:10.377 "flush": true, 00:13:10.377 "reset": true, 00:13:10.377 "compare": false, 00:13:10.377 "compare_and_write": false, 00:13:10.377 "abort": true, 00:13:10.377 "nvme_admin": false, 00:13:10.377 "nvme_io": false 00:13:10.377 }, 00:13:10.377 "memory_domains": [ 00:13:10.377 { 
00:13:10.377 "dma_device_id": "system", 00:13:10.377 "dma_device_type": 1 00:13:10.377 }, 00:13:10.377 { 00:13:10.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.377 "dma_device_type": 2 00:13:10.377 } 00:13:10.377 ], 00:13:10.377 "driver_specific": {} 00:13:10.377 }' 00:13:10.377 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:10.377 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:10.377 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:10.377 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:10.635 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:10.893 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:10.893 "name": "BaseBdev3", 00:13:10.893 "aliases": [ 00:13:10.893 "c8e5d26e-9de3-4c7d-b84f-c3edb8f70193" 00:13:10.893 ], 00:13:10.893 "product_name": "Malloc disk", 00:13:10.893 "block_size": 512, 00:13:10.893 "num_blocks": 65536, 00:13:10.893 "uuid": "c8e5d26e-9de3-4c7d-b84f-c3edb8f70193", 00:13:10.893 "assigned_rate_limits": { 00:13:10.893 "rw_ios_per_sec": 0, 00:13:10.893 "rw_mbytes_per_sec": 0, 00:13:10.893 "r_mbytes_per_sec": 0, 00:13:10.893 "w_mbytes_per_sec": 0 00:13:10.893 }, 00:13:10.893 "claimed": true, 00:13:10.893 "claim_type": "exclusive_write", 00:13:10.893 "zoned": false, 00:13:10.893 "supported_io_types": { 00:13:10.893 "read": true, 00:13:10.893 "write": true, 00:13:10.893 "unmap": true, 00:13:10.893 "write_zeroes": true, 00:13:10.893 "flush": true, 00:13:10.893 "reset": true, 00:13:10.893 "compare": false, 00:13:10.893 "compare_and_write": false, 00:13:10.893 "abort": true, 00:13:10.893 "nvme_admin": false, 00:13:10.893 "nvme_io": false 00:13:10.893 }, 00:13:10.893 "memory_domains": [ 00:13:10.893 { 00:13:10.893 "dma_device_id": "system", 00:13:10.893 "dma_device_type": 1 00:13:10.893 }, 00:13:10.893 { 00:13:10.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:10.893 "dma_device_type": 2 00:13:10.893 } 00:13:10.893 ], 00:13:10.893 "driver_specific": {} 00:13:10.893 }' 00:13:10.893 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:10.893 04:13:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:11.164 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:11.164 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:11.164 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:11.164 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:11.164 04:13:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:11.164 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:11.164 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:11.164 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:11.164 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:11.164 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:11.164 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:11.422 [2024-05-15 04:13:59.368851] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 0 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.422 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:11.680 04:13:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:11.680 "name": "Existed_Raid", 00:13:11.680 "uuid": "18dcc83d-059e-4cca-8b18-88454c14398c", 00:13:11.680 "strip_size_kb": 0, 00:13:11.680 "state": "online", 00:13:11.680 "raid_level": "raid1", 00:13:11.680 "superblock": true, 00:13:11.680 "num_base_bdevs": 3, 00:13:11.680 "num_base_bdevs_discovered": 2, 00:13:11.680 "num_base_bdevs_operational": 2, 00:13:11.680 "base_bdevs_list": [ 00:13:11.680 { 00:13:11.680 "name": null, 00:13:11.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.680 "is_configured": false, 00:13:11.680 "data_offset": 2048, 00:13:11.680 "data_size": 63488 00:13:11.680 }, 00:13:11.680 { 00:13:11.680 "name": "BaseBdev2", 00:13:11.680 "uuid": "78d2dd61-34d2-4e1e-9298-742f7be19584", 00:13:11.680 "is_configured": true, 00:13:11.680 "data_offset": 2048, 00:13:11.680 "data_size": 63488 00:13:11.680 }, 00:13:11.680 { 00:13:11.680 "name": "BaseBdev3", 00:13:11.680 "uuid": "c8e5d26e-9de3-4c7d-b84f-c3edb8f70193", 00:13:11.680 "is_configured": true, 00:13:11.680 "data_offset": 2048, 00:13:11.680 "data_size": 63488 00:13:11.680 } 00:13:11.680 ] 00:13:11.680 }' 00:13:11.680 04:13:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:11.680 04:13:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:12.246 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:13:12.246 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:12.246 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.246 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:13:12.504 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:13:12.504 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:12.504 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:12.762 [2024-05-15 04:14:00.670657] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:12.762 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:13:12.762 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:12.762 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.762 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:13:13.019 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:13:13.020 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:13.020 04:14:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:13.278 [2024-05-15 04:14:01.209042] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: 
*DEBUG*: BaseBdev3 00:13:13.278 [2024-05-15 04:14:01.209173] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:13.278 [2024-05-15 04:14:01.221541] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:13.278 [2024-05-15 04:14:01.221615] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:13.278 [2024-05-15 04:14:01.221629] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf327e0 name Existed_Raid, state offline 00:13:13.278 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:13:13.278 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:13:13.278 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.278 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:13:13.535 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:13:13.535 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:13:13.535 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 3 -gt 2 ']' 00:13:13.535 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:13:13.535 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:13.535 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:13.826 BaseBdev2 00:13:13.826 04:14:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:13:13.826 04:14:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:13:13.826 04:14:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:13.826 04:14:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:13.826 04:14:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:13.826 04:14:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:13.826 04:14:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:14.086 04:14:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:14.345 [ 00:13:14.345 { 00:13:14.345 "name": "BaseBdev2", 00:13:14.345 "aliases": [ 00:13:14.345 "a2e41db0-adda-4199-be7c-ab1efb1e779b" 00:13:14.345 ], 00:13:14.345 "product_name": "Malloc disk", 00:13:14.345 "block_size": 512, 00:13:14.345 "num_blocks": 65536, 00:13:14.345 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:14.345 "assigned_rate_limits": { 00:13:14.345 "rw_ios_per_sec": 0, 00:13:14.345 "rw_mbytes_per_sec": 0, 00:13:14.345 "r_mbytes_per_sec": 0, 00:13:14.345 "w_mbytes_per_sec": 0 00:13:14.345 }, 00:13:14.345 
"claimed": false, 00:13:14.345 "zoned": false, 00:13:14.345 "supported_io_types": { 00:13:14.345 "read": true, 00:13:14.345 "write": true, 00:13:14.345 "unmap": true, 00:13:14.345 "write_zeroes": true, 00:13:14.345 "flush": true, 00:13:14.345 "reset": true, 00:13:14.345 "compare": false, 00:13:14.345 "compare_and_write": false, 00:13:14.345 "abort": true, 00:13:14.345 "nvme_admin": false, 00:13:14.345 "nvme_io": false 00:13:14.345 }, 00:13:14.345 "memory_domains": [ 00:13:14.345 { 00:13:14.345 "dma_device_id": "system", 00:13:14.345 "dma_device_type": 1 00:13:14.345 }, 00:13:14.345 { 00:13:14.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.345 "dma_device_type": 2 00:13:14.345 } 00:13:14.345 ], 00:13:14.345 "driver_specific": {} 00:13:14.345 } 00:13:14.345 ] 00:13:14.345 04:14:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:14.345 04:14:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:13:14.345 04:14:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:14.345 04:14:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:14.603 BaseBdev3 00:13:14.603 04:14:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:13:14.603 04:14:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:13:14.603 04:14:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:14.603 04:14:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:14.603 04:14:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:14.603 04:14:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:14.603 04:14:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:14.861 04:14:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:15.120 [ 00:13:15.120 { 00:13:15.120 "name": "BaseBdev3", 00:13:15.120 "aliases": [ 00:13:15.120 "c3672ba5-a087-4676-829b-02cb99cedb82" 00:13:15.121 ], 00:13:15.121 "product_name": "Malloc disk", 00:13:15.121 "block_size": 512, 00:13:15.121 "num_blocks": 65536, 00:13:15.121 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:15.121 "assigned_rate_limits": { 00:13:15.121 "rw_ios_per_sec": 0, 00:13:15.121 "rw_mbytes_per_sec": 0, 00:13:15.121 "r_mbytes_per_sec": 0, 00:13:15.121 "w_mbytes_per_sec": 0 00:13:15.121 }, 00:13:15.121 "claimed": false, 00:13:15.121 "zoned": false, 00:13:15.121 "supported_io_types": { 00:13:15.121 "read": true, 00:13:15.121 "write": true, 00:13:15.121 "unmap": true, 00:13:15.121 "write_zeroes": true, 00:13:15.121 "flush": true, 00:13:15.121 "reset": true, 00:13:15.121 "compare": false, 00:13:15.121 "compare_and_write": false, 00:13:15.121 "abort": true, 00:13:15.121 "nvme_admin": false, 00:13:15.121 "nvme_io": false 00:13:15.121 }, 00:13:15.121 "memory_domains": [ 00:13:15.121 { 00:13:15.121 "dma_device_id": "system", 00:13:15.121 
"dma_device_type": 1 00:13:15.121 }, 00:13:15.121 { 00:13:15.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:15.121 "dma_device_type": 2 00:13:15.121 } 00:13:15.121 ], 00:13:15.121 "driver_specific": {} 00:13:15.121 } 00:13:15.121 ] 00:13:15.121 04:14:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:15.121 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:13:15.121 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:13:15.121 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:15.379 [2024-05-15 04:14:03.270366] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:15.379 [2024-05-15 04:14:03.270413] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:15.379 [2024-05-15 04:14:03.270438] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:15.379 [2024-05-15 04:14:03.271707] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.379 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:15.636 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:15.636 "name": "Existed_Raid", 00:13:15.636 "uuid": "550438e7-6be0-4553-9eca-744e99ff6c0d", 00:13:15.636 "strip_size_kb": 0, 00:13:15.636 "state": "configuring", 00:13:15.636 "raid_level": "raid1", 00:13:15.636 "superblock": true, 00:13:15.636 "num_base_bdevs": 3, 00:13:15.636 "num_base_bdevs_discovered": 2, 00:13:15.636 "num_base_bdevs_operational": 3, 00:13:15.636 "base_bdevs_list": [ 00:13:15.636 { 00:13:15.636 "name": "BaseBdev1", 00:13:15.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:15.636 "is_configured": false, 00:13:15.636 "data_offset": 0, 00:13:15.636 
"data_size": 0 00:13:15.636 }, 00:13:15.636 { 00:13:15.636 "name": "BaseBdev2", 00:13:15.636 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:15.636 "is_configured": true, 00:13:15.636 "data_offset": 2048, 00:13:15.636 "data_size": 63488 00:13:15.636 }, 00:13:15.636 { 00:13:15.636 "name": "BaseBdev3", 00:13:15.636 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:15.636 "is_configured": true, 00:13:15.636 "data_offset": 2048, 00:13:15.636 "data_size": 63488 00:13:15.636 } 00:13:15.636 ] 00:13:15.636 }' 00:13:15.636 04:14:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:15.636 04:14:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:16.201 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:16.459 [2024-05-15 04:14:04.293051] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.459 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.717 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:16.717 "name": "Existed_Raid", 00:13:16.717 "uuid": "550438e7-6be0-4553-9eca-744e99ff6c0d", 00:13:16.717 "strip_size_kb": 0, 00:13:16.717 "state": "configuring", 00:13:16.717 "raid_level": "raid1", 00:13:16.717 "superblock": true, 00:13:16.717 "num_base_bdevs": 3, 00:13:16.717 "num_base_bdevs_discovered": 1, 00:13:16.717 "num_base_bdevs_operational": 3, 00:13:16.717 "base_bdevs_list": [ 00:13:16.717 { 00:13:16.717 "name": "BaseBdev1", 00:13:16.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:16.717 "is_configured": false, 00:13:16.717 "data_offset": 0, 00:13:16.717 "data_size": 0 00:13:16.717 }, 00:13:16.717 { 00:13:16.717 "name": null, 00:13:16.717 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:16.717 "is_configured": false, 00:13:16.717 "data_offset": 2048, 00:13:16.717 "data_size": 63488 00:13:16.717 }, 00:13:16.717 { 00:13:16.717 
"name": "BaseBdev3", 00:13:16.717 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:16.717 "is_configured": true, 00:13:16.717 "data_offset": 2048, 00:13:16.717 "data_size": 63488 00:13:16.717 } 00:13:16.717 ] 00:13:16.717 }' 00:13:16.717 04:14:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:16.717 04:14:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:17.283 04:14:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:17.283 04:14:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:17.546 04:14:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:13:17.546 04:14:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:17.804 [2024-05-15 04:14:05.569551] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:17.804 BaseBdev1 00:13:17.804 04:14:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:13:17.804 04:14:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:17.804 04:14:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:17.804 04:14:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:17.804 04:14:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:17.804 04:14:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:17.804 04:14:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:18.062 04:14:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:18.320 [ 00:13:18.320 { 00:13:18.320 "name": "BaseBdev1", 00:13:18.320 "aliases": [ 00:13:18.320 "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c" 00:13:18.320 ], 00:13:18.320 "product_name": "Malloc disk", 00:13:18.320 "block_size": 512, 00:13:18.320 "num_blocks": 65536, 00:13:18.320 "uuid": "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c", 00:13:18.320 "assigned_rate_limits": { 00:13:18.320 "rw_ios_per_sec": 0, 00:13:18.320 "rw_mbytes_per_sec": 0, 00:13:18.320 "r_mbytes_per_sec": 0, 00:13:18.320 "w_mbytes_per_sec": 0 00:13:18.320 }, 00:13:18.320 "claimed": true, 00:13:18.320 "claim_type": "exclusive_write", 00:13:18.320 "zoned": false, 00:13:18.320 "supported_io_types": { 00:13:18.320 "read": true, 00:13:18.320 "write": true, 00:13:18.320 "unmap": true, 00:13:18.320 "write_zeroes": true, 00:13:18.320 "flush": true, 00:13:18.320 "reset": true, 00:13:18.320 "compare": false, 00:13:18.320 "compare_and_write": false, 00:13:18.320 "abort": true, 00:13:18.320 "nvme_admin": false, 00:13:18.320 "nvme_io": false 00:13:18.320 }, 00:13:18.320 "memory_domains": [ 00:13:18.320 { 00:13:18.320 "dma_device_id": "system", 00:13:18.320 "dma_device_type": 1 00:13:18.320 }, 00:13:18.320 { 
00:13:18.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.320 "dma_device_type": 2 00:13:18.320 } 00:13:18.320 ], 00:13:18.320 "driver_specific": {} 00:13:18.320 } 00:13:18.320 ] 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.320 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:18.578 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:18.578 "name": "Existed_Raid", 00:13:18.578 "uuid": "550438e7-6be0-4553-9eca-744e99ff6c0d", 00:13:18.578 "strip_size_kb": 0, 00:13:18.578 "state": "configuring", 00:13:18.578 "raid_level": "raid1", 00:13:18.578 "superblock": true, 00:13:18.578 "num_base_bdevs": 3, 00:13:18.578 "num_base_bdevs_discovered": 2, 00:13:18.578 "num_base_bdevs_operational": 3, 00:13:18.578 "base_bdevs_list": [ 00:13:18.578 { 00:13:18.578 "name": "BaseBdev1", 00:13:18.578 "uuid": "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c", 00:13:18.578 "is_configured": true, 00:13:18.578 "data_offset": 2048, 00:13:18.578 "data_size": 63488 00:13:18.578 }, 00:13:18.578 { 00:13:18.578 "name": null, 00:13:18.578 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:18.578 "is_configured": false, 00:13:18.578 "data_offset": 2048, 00:13:18.578 "data_size": 63488 00:13:18.578 }, 00:13:18.578 { 00:13:18.578 "name": "BaseBdev3", 00:13:18.578 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:18.578 "is_configured": true, 00:13:18.578 "data_offset": 2048, 00:13:18.578 "data_size": 63488 00:13:18.578 } 00:13:18.578 ] 00:13:18.578 }' 00:13:18.578 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:18.578 04:14:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:19.144 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.144 04:14:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:13:19.144 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:13:19.144 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:19.402 [2024-05-15 04:14:07.362335] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.402 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:19.660 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:19.660 "name": "Existed_Raid", 00:13:19.660 "uuid": "550438e7-6be0-4553-9eca-744e99ff6c0d", 00:13:19.660 "strip_size_kb": 0, 00:13:19.660 "state": "configuring", 00:13:19.660 "raid_level": "raid1", 00:13:19.660 "superblock": true, 00:13:19.660 "num_base_bdevs": 3, 00:13:19.660 "num_base_bdevs_discovered": 1, 00:13:19.660 "num_base_bdevs_operational": 3, 00:13:19.660 "base_bdevs_list": [ 00:13:19.660 { 00:13:19.660 "name": "BaseBdev1", 00:13:19.660 "uuid": "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c", 00:13:19.660 "is_configured": true, 00:13:19.660 "data_offset": 2048, 00:13:19.660 "data_size": 63488 00:13:19.660 }, 00:13:19.660 { 00:13:19.660 "name": null, 00:13:19.660 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:19.660 "is_configured": false, 00:13:19.660 "data_offset": 2048, 00:13:19.660 "data_size": 63488 00:13:19.660 }, 00:13:19.660 { 00:13:19.660 "name": null, 00:13:19.660 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:19.660 "is_configured": false, 00:13:19.660 "data_offset": 2048, 00:13:19.660 "data_size": 63488 00:13:19.660 } 00:13:19.660 ] 00:13:19.660 }' 00:13:19.660 04:14:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:19.660 04:14:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:20.225 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.225 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:20.483 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:13:20.483 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:20.741 [2024-05-15 04:14:08.629742] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.741 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:20.999 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:20.999 "name": "Existed_Raid", 00:13:20.999 "uuid": "550438e7-6be0-4553-9eca-744e99ff6c0d", 00:13:20.999 "strip_size_kb": 0, 00:13:20.999 "state": "configuring", 00:13:20.999 "raid_level": "raid1", 00:13:20.999 "superblock": true, 00:13:20.999 "num_base_bdevs": 3, 00:13:20.999 "num_base_bdevs_discovered": 2, 00:13:20.999 "num_base_bdevs_operational": 3, 00:13:20.999 "base_bdevs_list": [ 00:13:20.999 { 00:13:20.999 "name": "BaseBdev1", 00:13:20.999 "uuid": "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c", 00:13:20.999 "is_configured": true, 00:13:20.999 "data_offset": 2048, 00:13:20.999 "data_size": 63488 00:13:20.999 }, 00:13:20.999 { 00:13:20.999 "name": null, 00:13:20.999 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:20.999 "is_configured": false, 00:13:20.999 "data_offset": 2048, 00:13:20.999 "data_size": 63488 00:13:20.999 }, 00:13:20.999 { 00:13:20.999 "name": "BaseBdev3", 00:13:20.999 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:20.999 "is_configured": true, 00:13:20.999 "data_offset": 2048, 00:13:20.999 "data_size": 63488 00:13:20.999 } 00:13:20.999 ] 00:13:20.999 }' 00:13:20.999 04:14:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # 
xtrace_disable 00:13:20.999 04:14:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:21.564 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.564 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:21.821 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:13:21.821 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:22.079 [2024-05-15 04:14:09.905174] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.079 04:14:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:22.337 04:14:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:22.337 "name": "Existed_Raid", 00:13:22.337 "uuid": "550438e7-6be0-4553-9eca-744e99ff6c0d", 00:13:22.337 "strip_size_kb": 0, 00:13:22.337 "state": "configuring", 00:13:22.337 "raid_level": "raid1", 00:13:22.337 "superblock": true, 00:13:22.337 "num_base_bdevs": 3, 00:13:22.337 "num_base_bdevs_discovered": 1, 00:13:22.337 "num_base_bdevs_operational": 3, 00:13:22.337 "base_bdevs_list": [ 00:13:22.337 { 00:13:22.337 "name": null, 00:13:22.337 "uuid": "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c", 00:13:22.337 "is_configured": false, 00:13:22.337 "data_offset": 2048, 00:13:22.337 "data_size": 63488 00:13:22.337 }, 00:13:22.337 { 00:13:22.337 "name": null, 00:13:22.337 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:22.337 "is_configured": false, 00:13:22.337 "data_offset": 2048, 00:13:22.337 "data_size": 63488 00:13:22.337 }, 00:13:22.337 { 00:13:22.337 "name": "BaseBdev3", 00:13:22.337 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:22.337 "is_configured": true, 00:13:22.337 "data_offset": 2048, 00:13:22.337 
"data_size": 63488 00:13:22.337 } 00:13:22.337 ] 00:13:22.337 }' 00:13:22.337 04:14:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:22.337 04:14:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:22.902 04:14:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.902 04:14:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:23.161 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:13:23.161 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:23.418 [2024-05-15 04:14:11.278411] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.418 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.676 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:23.676 "name": "Existed_Raid", 00:13:23.676 "uuid": "550438e7-6be0-4553-9eca-744e99ff6c0d", 00:13:23.676 "strip_size_kb": 0, 00:13:23.676 "state": "configuring", 00:13:23.676 "raid_level": "raid1", 00:13:23.676 "superblock": true, 00:13:23.676 "num_base_bdevs": 3, 00:13:23.676 "num_base_bdevs_discovered": 2, 00:13:23.676 "num_base_bdevs_operational": 3, 00:13:23.676 "base_bdevs_list": [ 00:13:23.676 { 00:13:23.676 "name": null, 00:13:23.676 "uuid": "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c", 00:13:23.676 "is_configured": false, 00:13:23.676 "data_offset": 2048, 00:13:23.676 "data_size": 63488 00:13:23.676 }, 00:13:23.676 { 00:13:23.676 "name": "BaseBdev2", 00:13:23.676 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:23.676 "is_configured": true, 00:13:23.676 "data_offset": 2048, 00:13:23.676 "data_size": 63488 
00:13:23.676 }, 00:13:23.676 { 00:13:23.676 "name": "BaseBdev3", 00:13:23.676 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:23.676 "is_configured": true, 00:13:23.676 "data_offset": 2048, 00:13:23.676 "data_size": 63488 00:13:23.676 } 00:13:23.676 ] 00:13:23.676 }' 00:13:23.676 04:14:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:23.676 04:14:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:24.242 04:14:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.242 04:14:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:24.499 04:14:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:13:24.499 04:14:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.499 04:14:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:24.757 04:14:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u cedfbaef-81f2-4141-9fa0-e676f3fc7b5c 00:13:25.014 [2024-05-15 04:14:12.887953] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:25.014 [2024-05-15 04:14:12.888194] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x10e5b90 00:13:25.014 [2024-05-15 04:14:12.888210] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:25.014 [2024-05-15 04:14:12.888360] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf31ff0 00:13:25.014 [2024-05-15 04:14:12.888488] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10e5b90 00:13:25.014 [2024-05-15 04:14:12.888502] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10e5b90 00:13:25.014 [2024-05-15 04:14:12.888593] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:25.014 NewBaseBdev 00:13:25.014 04:14:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:13:25.014 04:14:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:13:25.014 04:14:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:25.014 04:14:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:13:25.014 04:14:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:25.014 04:14:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:25.015 04:14:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:25.272 04:14:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 
2000 00:13:25.531 [ 00:13:25.531 { 00:13:25.531 "name": "NewBaseBdev", 00:13:25.531 "aliases": [ 00:13:25.531 "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c" 00:13:25.531 ], 00:13:25.531 "product_name": "Malloc disk", 00:13:25.531 "block_size": 512, 00:13:25.531 "num_blocks": 65536, 00:13:25.531 "uuid": "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c", 00:13:25.531 "assigned_rate_limits": { 00:13:25.531 "rw_ios_per_sec": 0, 00:13:25.531 "rw_mbytes_per_sec": 0, 00:13:25.531 "r_mbytes_per_sec": 0, 00:13:25.531 "w_mbytes_per_sec": 0 00:13:25.531 }, 00:13:25.531 "claimed": true, 00:13:25.531 "claim_type": "exclusive_write", 00:13:25.531 "zoned": false, 00:13:25.531 "supported_io_types": { 00:13:25.531 "read": true, 00:13:25.531 "write": true, 00:13:25.531 "unmap": true, 00:13:25.531 "write_zeroes": true, 00:13:25.531 "flush": true, 00:13:25.531 "reset": true, 00:13:25.531 "compare": false, 00:13:25.531 "compare_and_write": false, 00:13:25.531 "abort": true, 00:13:25.531 "nvme_admin": false, 00:13:25.531 "nvme_io": false 00:13:25.531 }, 00:13:25.531 "memory_domains": [ 00:13:25.531 { 00:13:25.531 "dma_device_id": "system", 00:13:25.531 "dma_device_type": 1 00:13:25.531 }, 00:13:25.531 { 00:13:25.531 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.531 "dma_device_type": 2 00:13:25.531 } 00:13:25.531 ], 00:13:25.531 "driver_specific": {} 00:13:25.531 } 00:13:25.531 ] 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.531 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.789 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:25.789 "name": "Existed_Raid", 00:13:25.789 "uuid": "550438e7-6be0-4553-9eca-744e99ff6c0d", 00:13:25.789 "strip_size_kb": 0, 00:13:25.789 "state": "online", 00:13:25.789 "raid_level": "raid1", 00:13:25.789 "superblock": true, 00:13:25.789 "num_base_bdevs": 3, 00:13:25.789 "num_base_bdevs_discovered": 3, 00:13:25.789 "num_base_bdevs_operational": 3, 00:13:25.789 "base_bdevs_list": [ 00:13:25.789 { 00:13:25.789 "name": "NewBaseBdev", 00:13:25.789 
"uuid": "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c", 00:13:25.789 "is_configured": true, 00:13:25.789 "data_offset": 2048, 00:13:25.789 "data_size": 63488 00:13:25.789 }, 00:13:25.789 { 00:13:25.789 "name": "BaseBdev2", 00:13:25.789 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:25.789 "is_configured": true, 00:13:25.789 "data_offset": 2048, 00:13:25.789 "data_size": 63488 00:13:25.789 }, 00:13:25.789 { 00:13:25.789 "name": "BaseBdev3", 00:13:25.789 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:25.789 "is_configured": true, 00:13:25.789 "data_offset": 2048, 00:13:25.789 "data_size": 63488 00:13:25.789 } 00:13:25.789 ] 00:13:25.789 }' 00:13:25.789 04:14:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:25.789 04:14:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:26.353 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:13:26.353 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:13:26.353 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:26.353 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:26.353 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:26.353 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:13:26.353 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:26.353 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:26.611 [2024-05-15 04:14:14.480419] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:26.611 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:26.611 "name": "Existed_Raid", 00:13:26.611 "aliases": [ 00:13:26.611 "550438e7-6be0-4553-9eca-744e99ff6c0d" 00:13:26.611 ], 00:13:26.611 "product_name": "Raid Volume", 00:13:26.611 "block_size": 512, 00:13:26.611 "num_blocks": 63488, 00:13:26.611 "uuid": "550438e7-6be0-4553-9eca-744e99ff6c0d", 00:13:26.611 "assigned_rate_limits": { 00:13:26.611 "rw_ios_per_sec": 0, 00:13:26.611 "rw_mbytes_per_sec": 0, 00:13:26.611 "r_mbytes_per_sec": 0, 00:13:26.611 "w_mbytes_per_sec": 0 00:13:26.611 }, 00:13:26.611 "claimed": false, 00:13:26.611 "zoned": false, 00:13:26.611 "supported_io_types": { 00:13:26.611 "read": true, 00:13:26.611 "write": true, 00:13:26.611 "unmap": false, 00:13:26.611 "write_zeroes": true, 00:13:26.611 "flush": false, 00:13:26.611 "reset": true, 00:13:26.611 "compare": false, 00:13:26.611 "compare_and_write": false, 00:13:26.611 "abort": false, 00:13:26.611 "nvme_admin": false, 00:13:26.611 "nvme_io": false 00:13:26.611 }, 00:13:26.611 "memory_domains": [ 00:13:26.611 { 00:13:26.611 "dma_device_id": "system", 00:13:26.611 "dma_device_type": 1 00:13:26.611 }, 00:13:26.611 { 00:13:26.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.611 "dma_device_type": 2 00:13:26.611 }, 00:13:26.611 { 00:13:26.611 "dma_device_id": "system", 00:13:26.611 "dma_device_type": 1 00:13:26.611 }, 00:13:26.611 { 00:13:26.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.611 "dma_device_type": 2 00:13:26.611 }, 00:13:26.611 { 00:13:26.611 
"dma_device_id": "system", 00:13:26.611 "dma_device_type": 1 00:13:26.611 }, 00:13:26.611 { 00:13:26.611 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.611 "dma_device_type": 2 00:13:26.611 } 00:13:26.611 ], 00:13:26.611 "driver_specific": { 00:13:26.611 "raid": { 00:13:26.611 "uuid": "550438e7-6be0-4553-9eca-744e99ff6c0d", 00:13:26.611 "strip_size_kb": 0, 00:13:26.611 "state": "online", 00:13:26.611 "raid_level": "raid1", 00:13:26.611 "superblock": true, 00:13:26.611 "num_base_bdevs": 3, 00:13:26.611 "num_base_bdevs_discovered": 3, 00:13:26.611 "num_base_bdevs_operational": 3, 00:13:26.611 "base_bdevs_list": [ 00:13:26.611 { 00:13:26.611 "name": "NewBaseBdev", 00:13:26.611 "uuid": "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c", 00:13:26.611 "is_configured": true, 00:13:26.611 "data_offset": 2048, 00:13:26.611 "data_size": 63488 00:13:26.611 }, 00:13:26.611 { 00:13:26.611 "name": "BaseBdev2", 00:13:26.611 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:26.611 "is_configured": true, 00:13:26.611 "data_offset": 2048, 00:13:26.611 "data_size": 63488 00:13:26.611 }, 00:13:26.611 { 00:13:26.611 "name": "BaseBdev3", 00:13:26.611 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:26.611 "is_configured": true, 00:13:26.611 "data_offset": 2048, 00:13:26.611 "data_size": 63488 00:13:26.611 } 00:13:26.611 ] 00:13:26.611 } 00:13:26.611 } 00:13:26.611 }' 00:13:26.611 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:26.611 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:13:26.611 BaseBdev2 00:13:26.611 BaseBdev3' 00:13:26.611 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:26.611 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:26.611 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:26.869 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:26.869 "name": "NewBaseBdev", 00:13:26.869 "aliases": [ 00:13:26.869 "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c" 00:13:26.869 ], 00:13:26.869 "product_name": "Malloc disk", 00:13:26.869 "block_size": 512, 00:13:26.869 "num_blocks": 65536, 00:13:26.869 "uuid": "cedfbaef-81f2-4141-9fa0-e676f3fc7b5c", 00:13:26.869 "assigned_rate_limits": { 00:13:26.869 "rw_ios_per_sec": 0, 00:13:26.869 "rw_mbytes_per_sec": 0, 00:13:26.869 "r_mbytes_per_sec": 0, 00:13:26.869 "w_mbytes_per_sec": 0 00:13:26.869 }, 00:13:26.869 "claimed": true, 00:13:26.869 "claim_type": "exclusive_write", 00:13:26.869 "zoned": false, 00:13:26.869 "supported_io_types": { 00:13:26.869 "read": true, 00:13:26.869 "write": true, 00:13:26.869 "unmap": true, 00:13:26.869 "write_zeroes": true, 00:13:26.869 "flush": true, 00:13:26.869 "reset": true, 00:13:26.869 "compare": false, 00:13:26.869 "compare_and_write": false, 00:13:26.869 "abort": true, 00:13:26.869 "nvme_admin": false, 00:13:26.869 "nvme_io": false 00:13:26.869 }, 00:13:26.869 "memory_domains": [ 00:13:26.869 { 00:13:26.869 "dma_device_id": "system", 00:13:26.869 "dma_device_type": 1 00:13:26.869 }, 00:13:26.869 { 00:13:26.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.869 "dma_device_type": 2 00:13:26.869 } 00:13:26.869 ], 00:13:26.869 "driver_specific": {} 
00:13:26.869 }' 00:13:26.869 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:26.869 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:26.869 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:26.869 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:27.125 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:27.125 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:27.125 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:27.125 04:14:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:27.125 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:27.125 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:27.125 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:27.125 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:27.125 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:27.125 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:27.125 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:27.382 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:27.382 "name": "BaseBdev2", 00:13:27.382 "aliases": [ 00:13:27.382 "a2e41db0-adda-4199-be7c-ab1efb1e779b" 00:13:27.382 ], 00:13:27.382 "product_name": "Malloc disk", 00:13:27.382 "block_size": 512, 00:13:27.382 "num_blocks": 65536, 00:13:27.382 "uuid": "a2e41db0-adda-4199-be7c-ab1efb1e779b", 00:13:27.382 "assigned_rate_limits": { 00:13:27.382 "rw_ios_per_sec": 0, 00:13:27.382 "rw_mbytes_per_sec": 0, 00:13:27.382 "r_mbytes_per_sec": 0, 00:13:27.382 "w_mbytes_per_sec": 0 00:13:27.382 }, 00:13:27.382 "claimed": true, 00:13:27.382 "claim_type": "exclusive_write", 00:13:27.382 "zoned": false, 00:13:27.382 "supported_io_types": { 00:13:27.382 "read": true, 00:13:27.382 "write": true, 00:13:27.382 "unmap": true, 00:13:27.382 "write_zeroes": true, 00:13:27.382 "flush": true, 00:13:27.382 "reset": true, 00:13:27.382 "compare": false, 00:13:27.382 "compare_and_write": false, 00:13:27.382 "abort": true, 00:13:27.382 "nvme_admin": false, 00:13:27.382 "nvme_io": false 00:13:27.382 }, 00:13:27.382 "memory_domains": [ 00:13:27.382 { 00:13:27.382 "dma_device_id": "system", 00:13:27.382 "dma_device_type": 1 00:13:27.382 }, 00:13:27.382 { 00:13:27.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.382 "dma_device_type": 2 00:13:27.382 } 00:13:27.382 ], 00:13:27.382 "driver_specific": {} 00:13:27.382 }' 00:13:27.382 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:27.382 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 
00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:27.639 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:27.896 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:27.896 "name": "BaseBdev3", 00:13:27.896 "aliases": [ 00:13:27.896 "c3672ba5-a087-4676-829b-02cb99cedb82" 00:13:27.896 ], 00:13:27.896 "product_name": "Malloc disk", 00:13:27.896 "block_size": 512, 00:13:27.896 "num_blocks": 65536, 00:13:27.896 "uuid": "c3672ba5-a087-4676-829b-02cb99cedb82", 00:13:27.896 "assigned_rate_limits": { 00:13:27.896 "rw_ios_per_sec": 0, 00:13:27.896 "rw_mbytes_per_sec": 0, 00:13:27.896 "r_mbytes_per_sec": 0, 00:13:27.896 "w_mbytes_per_sec": 0 00:13:27.896 }, 00:13:27.896 "claimed": true, 00:13:27.896 "claim_type": "exclusive_write", 00:13:27.896 "zoned": false, 00:13:27.896 "supported_io_types": { 00:13:27.896 "read": true, 00:13:27.896 "write": true, 00:13:27.896 "unmap": true, 00:13:27.896 "write_zeroes": true, 00:13:27.896 "flush": true, 00:13:27.896 "reset": true, 00:13:27.896 "compare": false, 00:13:27.896 "compare_and_write": false, 00:13:27.896 "abort": true, 00:13:27.896 "nvme_admin": false, 00:13:27.896 "nvme_io": false 00:13:27.896 }, 00:13:27.896 "memory_domains": [ 00:13:27.896 { 00:13:27.896 "dma_device_id": "system", 00:13:27.896 "dma_device_type": 1 00:13:27.896 }, 00:13:27.896 { 00:13:27.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.896 "dma_device_type": 2 00:13:27.896 } 00:13:27.896 ], 00:13:27.896 "driver_specific": {} 00:13:27.896 }' 00:13:27.896 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:27.896 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:28.154 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:28.154 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:28.154 04:14:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:28.154 04:14:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:28.154 04:14:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:28.154 04:14:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 
00:13:28.154 04:14:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:28.154 04:14:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:28.154 04:14:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:28.154 04:14:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:28.154 04:14:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:28.411 [2024-05-15 04:14:16.381308] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:28.411 [2024-05-15 04:14:16.381339] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:28.412 [2024-05-15 04:14:16.381413] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:28.412 [2024-05-15 04:14:16.381683] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:28.412 [2024-05-15 04:14:16.381700] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10e5b90 name Existed_Raid, state offline 00:13:28.412 04:14:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 3862350 00:13:28.412 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3862350 ']' 00:13:28.412 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 3862350 00:13:28.412 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:13:28.412 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:13:28.412 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3862350 00:13:28.670 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:13:28.670 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:13:28.670 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3862350' 00:13:28.670 killing process with pid 3862350 00:13:28.670 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 3862350 00:13:28.670 [2024-05-15 04:14:16.432664] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:28.670 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 3862350 00:13:28.670 [2024-05-15 04:14:16.466663] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:28.927 04:14:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:13:28.927 00:13:28.927 real 0m28.037s 00:13:28.927 user 0m52.306s 00:13:28.927 sys 0m3.791s 00:13:28.927 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:13:28.927 04:14:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:28.927 ************************************ 00:13:28.927 END TEST raid_state_function_test_sb 00:13:28.927 ************************************ 00:13:28.927 04:14:16 bdev_raid -- bdev/bdev_raid.sh@805 -- # run_test raid_superblock_test raid_superblock_test raid1 3 
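For reference, the base-bdev hot-remove/re-add checks that raid_state_function_test_sb exercised above reduce to the pattern sketched here. This is only a sketch: check_raid_state is a hypothetical helper, while the rpc.py subcommands (bdev_raid_get_bdevs, bdev_raid_remove_base_bdev, bdev_raid_add_base_bdev) and the jq filters are the ones visible in the trace.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Hypothetical helper mirroring the bdev_raid_get_bdevs + jq checks above;
    # the .name/.state fields match the raid_bdev_info JSON printed in the log.
    check_raid_state() {
        local name=$1 expected=$2
        local state
        state=$($rpc -s $sock bdev_raid_get_bdevs all |
                jq -r ".[] | select(.name == \"$name\") | .state")
        [ "$state" = "$expected" ]
    }

    # Hot-remove a base bdev, expect the array to drop back to "configuring",
    # then re-add it and confirm the slot is marked configured again.
    $rpc -s $sock bdev_raid_remove_base_bdev BaseBdev2
    check_raid_state Existed_Raid configuring
    $rpc -s $sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2
    $rpc -s $sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[1].is_configured'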
00:13:28.927 04:14:16 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:13:28.927 04:14:16 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:13:28.927 04:14:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:28.927 ************************************ 00:13:28.927 START TEST raid_superblock_test 00:13:28.927 ************************************ 00:13:28.927 04:14:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 3 00:13:28.927 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=3 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=3866273 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 3866273 /var/tmp/spdk-raid.sock 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 3866273 ']' 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:28.928 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:13:28.928 04:14:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.928 [2024-05-15 04:14:16.859729] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
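The per-bdev setup that the xtrace prints next boils down to the loop below: for each of the three base devices, create a malloc bdev and wrap it in a passthru bdev with a fixed UUID. A minimal sketch, assuming bdev_svc is already listening on the raid socket as shown above; the sizes, bdev names, and UUIDs are the ones the trace reports.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    num_base_bdevs=3

    for i in $(seq 1 $num_base_bdevs); do
        # 32 MB backing malloc bdev with 512-byte blocks, as in bdev_raid.sh@425
        $rpc -s $sock bdev_malloc_create 32 512 -b malloc$i
        # passthru layer pt$i pinned to a deterministic UUID ending in 000$i
        $rpc -s $sock bdev_passthru_create -b malloc$i -p pt$i \
            -u 00000000-0000-0000-0000-00000000000$i
    done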
00:13:28.928 [2024-05-15 04:14:16.859815] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3866273 ] 00:13:28.928 [2024-05-15 04:14:16.942782] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:29.185 [2024-05-15 04:14:17.066858] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:29.185 [2024-05-15 04:14:17.131317] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:29.185 [2024-05-15 04:14:17.131370] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:30.117 04:14:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:30.117 malloc1 00:13:30.117 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:30.374 [2024-05-15 04:14:18.293736] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:30.374 [2024-05-15 04:14:18.293801] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:30.374 [2024-05-15 04:14:18.293835] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a76c20 00:13:30.374 [2024-05-15 04:14:18.293853] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:30.374 [2024-05-15 04:14:18.295444] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:30.374 [2024-05-15 04:14:18.295473] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:30.374 pt1 00:13:30.374 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:13:30.374 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:30.374 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:13:30.374 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:13:30.374 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local 
bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:30.374 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:30.374 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:13:30.374 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:30.374 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:30.631 malloc2 00:13:30.631 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:30.888 [2024-05-15 04:14:18.834490] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:30.888 [2024-05-15 04:14:18.834572] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:30.888 [2024-05-15 04:14:18.834599] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a6ec00 00:13:30.888 [2024-05-15 04:14:18.834615] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:30.888 [2024-05-15 04:14:18.836576] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:30.888 [2024-05-15 04:14:18.836605] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:30.888 pt2 00:13:30.888 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:13:30.888 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:30.888 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:13:30.888 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:13:30.888 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:13:30.888 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:30.888 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:13:30.888 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:30.888 04:14:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:13:31.145 malloc3 00:13:31.145 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:31.402 [2024-05-15 04:14:19.375048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:31.402 [2024-05-15 04:14:19.375104] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:31.402 [2024-05-15 04:14:19.375135] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c1f9c0 00:13:31.402 [2024-05-15 04:14:19.375151] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:31.403 [2024-05-15 04:14:19.376604] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:31.403 [2024-05-15 04:14:19.376633] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:31.403 pt3 00:13:31.403 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:13:31.403 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:13:31.403 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:13:31.660 [2024-05-15 04:14:19.671846] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:31.660 [2024-05-15 04:14:19.673012] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:31.660 [2024-05-15 04:14:19.673073] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:31.660 [2024-05-15 04:14:19.673260] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a728e0 00:13:31.660 [2024-05-15 04:14:19.673277] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:31.660 [2024-05-15 04:14:19.673466] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a8db10 00:13:31.660 [2024-05-15 04:14:19.673638] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a728e0 00:13:31.660 [2024-05-15 04:14:19.673655] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a728e0 00:13:31.660 [2024-05-15 04:14:19.673760] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.917 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:32.175 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:32.175 "name": "raid_bdev1", 00:13:32.175 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:32.175 "strip_size_kb": 0, 00:13:32.175 "state": "online", 00:13:32.175 "raid_level": "raid1", 00:13:32.175 "superblock": true, 00:13:32.175 "num_base_bdevs": 3, 00:13:32.175 
"num_base_bdevs_discovered": 3, 00:13:32.176 "num_base_bdevs_operational": 3, 00:13:32.176 "base_bdevs_list": [ 00:13:32.176 { 00:13:32.176 "name": "pt1", 00:13:32.176 "uuid": "8eb0f9c1-618c-548d-a5d6-7b2784936a6b", 00:13:32.176 "is_configured": true, 00:13:32.176 "data_offset": 2048, 00:13:32.176 "data_size": 63488 00:13:32.176 }, 00:13:32.176 { 00:13:32.176 "name": "pt2", 00:13:32.176 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:32.176 "is_configured": true, 00:13:32.176 "data_offset": 2048, 00:13:32.176 "data_size": 63488 00:13:32.176 }, 00:13:32.176 { 00:13:32.176 "name": "pt3", 00:13:32.176 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:32.176 "is_configured": true, 00:13:32.176 "data_offset": 2048, 00:13:32.176 "data_size": 63488 00:13:32.176 } 00:13:32.176 ] 00:13:32.176 }' 00:13:32.176 04:14:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:32.176 04:14:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.742 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:13:32.742 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:13:32.742 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:32.742 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:32.742 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:32.742 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:13:32.742 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:32.742 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:32.742 [2024-05-15 04:14:20.754932] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:33.036 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:33.036 "name": "raid_bdev1", 00:13:33.036 "aliases": [ 00:13:33.036 "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5" 00:13:33.036 ], 00:13:33.036 "product_name": "Raid Volume", 00:13:33.036 "block_size": 512, 00:13:33.037 "num_blocks": 63488, 00:13:33.037 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:33.037 "assigned_rate_limits": { 00:13:33.037 "rw_ios_per_sec": 0, 00:13:33.037 "rw_mbytes_per_sec": 0, 00:13:33.037 "r_mbytes_per_sec": 0, 00:13:33.037 "w_mbytes_per_sec": 0 00:13:33.037 }, 00:13:33.037 "claimed": false, 00:13:33.037 "zoned": false, 00:13:33.037 "supported_io_types": { 00:13:33.037 "read": true, 00:13:33.037 "write": true, 00:13:33.037 "unmap": false, 00:13:33.037 "write_zeroes": true, 00:13:33.037 "flush": false, 00:13:33.037 "reset": true, 00:13:33.037 "compare": false, 00:13:33.037 "compare_and_write": false, 00:13:33.037 "abort": false, 00:13:33.037 "nvme_admin": false, 00:13:33.037 "nvme_io": false 00:13:33.037 }, 00:13:33.037 "memory_domains": [ 00:13:33.037 { 00:13:33.037 "dma_device_id": "system", 00:13:33.037 "dma_device_type": 1 00:13:33.037 }, 00:13:33.037 { 00:13:33.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.037 "dma_device_type": 2 00:13:33.037 }, 00:13:33.037 { 00:13:33.037 "dma_device_id": "system", 00:13:33.037 "dma_device_type": 1 00:13:33.037 }, 00:13:33.037 { 00:13:33.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:13:33.037 "dma_device_type": 2 00:13:33.037 }, 00:13:33.037 { 00:13:33.037 "dma_device_id": "system", 00:13:33.037 "dma_device_type": 1 00:13:33.037 }, 00:13:33.037 { 00:13:33.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.037 "dma_device_type": 2 00:13:33.037 } 00:13:33.037 ], 00:13:33.037 "driver_specific": { 00:13:33.037 "raid": { 00:13:33.037 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:33.037 "strip_size_kb": 0, 00:13:33.037 "state": "online", 00:13:33.037 "raid_level": "raid1", 00:13:33.037 "superblock": true, 00:13:33.037 "num_base_bdevs": 3, 00:13:33.037 "num_base_bdevs_discovered": 3, 00:13:33.037 "num_base_bdevs_operational": 3, 00:13:33.037 "base_bdevs_list": [ 00:13:33.037 { 00:13:33.037 "name": "pt1", 00:13:33.037 "uuid": "8eb0f9c1-618c-548d-a5d6-7b2784936a6b", 00:13:33.037 "is_configured": true, 00:13:33.037 "data_offset": 2048, 00:13:33.037 "data_size": 63488 00:13:33.037 }, 00:13:33.037 { 00:13:33.037 "name": "pt2", 00:13:33.037 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:33.037 "is_configured": true, 00:13:33.037 "data_offset": 2048, 00:13:33.037 "data_size": 63488 00:13:33.037 }, 00:13:33.037 { 00:13:33.037 "name": "pt3", 00:13:33.037 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:33.037 "is_configured": true, 00:13:33.037 "data_offset": 2048, 00:13:33.037 "data_size": 63488 00:13:33.037 } 00:13:33.037 ] 00:13:33.037 } 00:13:33.037 } 00:13:33.037 }' 00:13:33.037 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:33.037 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:13:33.037 pt2 00:13:33.037 pt3' 00:13:33.037 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:33.037 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:33.037 04:14:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:33.321 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:33.321 "name": "pt1", 00:13:33.321 "aliases": [ 00:13:33.321 "8eb0f9c1-618c-548d-a5d6-7b2784936a6b" 00:13:33.321 ], 00:13:33.321 "product_name": "passthru", 00:13:33.321 "block_size": 512, 00:13:33.321 "num_blocks": 65536, 00:13:33.321 "uuid": "8eb0f9c1-618c-548d-a5d6-7b2784936a6b", 00:13:33.321 "assigned_rate_limits": { 00:13:33.321 "rw_ios_per_sec": 0, 00:13:33.321 "rw_mbytes_per_sec": 0, 00:13:33.321 "r_mbytes_per_sec": 0, 00:13:33.321 "w_mbytes_per_sec": 0 00:13:33.321 }, 00:13:33.321 "claimed": true, 00:13:33.321 "claim_type": "exclusive_write", 00:13:33.321 "zoned": false, 00:13:33.321 "supported_io_types": { 00:13:33.321 "read": true, 00:13:33.321 "write": true, 00:13:33.321 "unmap": true, 00:13:33.321 "write_zeroes": true, 00:13:33.321 "flush": true, 00:13:33.321 "reset": true, 00:13:33.321 "compare": false, 00:13:33.321 "compare_and_write": false, 00:13:33.321 "abort": true, 00:13:33.321 "nvme_admin": false, 00:13:33.321 "nvme_io": false 00:13:33.321 }, 00:13:33.321 "memory_domains": [ 00:13:33.321 { 00:13:33.321 "dma_device_id": "system", 00:13:33.321 "dma_device_type": 1 00:13:33.321 }, 00:13:33.321 { 00:13:33.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.321 "dma_device_type": 2 00:13:33.321 } 00:13:33.321 ], 00:13:33.321 "driver_specific": { 00:13:33.321 "passthru": { 
00:13:33.321 "name": "pt1", 00:13:33.321 "base_bdev_name": "malloc1" 00:13:33.321 } 00:13:33.321 } 00:13:33.321 }' 00:13:33.321 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:33.321 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:33.321 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:33.321 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:33.321 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:33.321 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:33.321 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:33.321 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:33.580 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:33.580 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:33.580 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:33.580 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:33.580 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:33.580 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:33.580 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:33.837 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:33.837 "name": "pt2", 00:13:33.837 "aliases": [ 00:13:33.837 "39e4c2d1-37eb-5342-bfc5-c595e17912ae" 00:13:33.837 ], 00:13:33.837 "product_name": "passthru", 00:13:33.837 "block_size": 512, 00:13:33.837 "num_blocks": 65536, 00:13:33.837 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:33.837 "assigned_rate_limits": { 00:13:33.837 "rw_ios_per_sec": 0, 00:13:33.837 "rw_mbytes_per_sec": 0, 00:13:33.837 "r_mbytes_per_sec": 0, 00:13:33.837 "w_mbytes_per_sec": 0 00:13:33.837 }, 00:13:33.837 "claimed": true, 00:13:33.837 "claim_type": "exclusive_write", 00:13:33.837 "zoned": false, 00:13:33.837 "supported_io_types": { 00:13:33.837 "read": true, 00:13:33.837 "write": true, 00:13:33.837 "unmap": true, 00:13:33.837 "write_zeroes": true, 00:13:33.837 "flush": true, 00:13:33.837 "reset": true, 00:13:33.837 "compare": false, 00:13:33.837 "compare_and_write": false, 00:13:33.837 "abort": true, 00:13:33.837 "nvme_admin": false, 00:13:33.837 "nvme_io": false 00:13:33.837 }, 00:13:33.837 "memory_domains": [ 00:13:33.837 { 00:13:33.837 "dma_device_id": "system", 00:13:33.837 "dma_device_type": 1 00:13:33.837 }, 00:13:33.837 { 00:13:33.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.837 "dma_device_type": 2 00:13:33.837 } 00:13:33.837 ], 00:13:33.837 "driver_specific": { 00:13:33.837 "passthru": { 00:13:33.837 "name": "pt2", 00:13:33.837 "base_bdev_name": "malloc2" 00:13:33.837 } 00:13:33.837 } 00:13:33.837 }' 00:13:33.837 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:33.837 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:33.837 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:33.837 04:14:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:33.837 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:33.837 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:33.837 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:33.837 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:34.093 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:34.093 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:34.093 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:34.093 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:34.093 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:34.093 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:34.093 04:14:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:34.351 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:34.351 "name": "pt3", 00:13:34.351 "aliases": [ 00:13:34.351 "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe" 00:13:34.351 ], 00:13:34.351 "product_name": "passthru", 00:13:34.351 "block_size": 512, 00:13:34.351 "num_blocks": 65536, 00:13:34.351 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:34.351 "assigned_rate_limits": { 00:13:34.351 "rw_ios_per_sec": 0, 00:13:34.351 "rw_mbytes_per_sec": 0, 00:13:34.351 "r_mbytes_per_sec": 0, 00:13:34.351 "w_mbytes_per_sec": 0 00:13:34.351 }, 00:13:34.351 "claimed": true, 00:13:34.351 "claim_type": "exclusive_write", 00:13:34.351 "zoned": false, 00:13:34.351 "supported_io_types": { 00:13:34.351 "read": true, 00:13:34.351 "write": true, 00:13:34.351 "unmap": true, 00:13:34.351 "write_zeroes": true, 00:13:34.351 "flush": true, 00:13:34.351 "reset": true, 00:13:34.351 "compare": false, 00:13:34.351 "compare_and_write": false, 00:13:34.351 "abort": true, 00:13:34.351 "nvme_admin": false, 00:13:34.351 "nvme_io": false 00:13:34.351 }, 00:13:34.351 "memory_domains": [ 00:13:34.351 { 00:13:34.351 "dma_device_id": "system", 00:13:34.351 "dma_device_type": 1 00:13:34.351 }, 00:13:34.351 { 00:13:34.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:34.351 "dma_device_type": 2 00:13:34.351 } 00:13:34.351 ], 00:13:34.351 "driver_specific": { 00:13:34.351 "passthru": { 00:13:34.351 "name": "pt3", 00:13:34.351 "base_bdev_name": "malloc3" 00:13:34.351 } 00:13:34.351 } 00:13:34.351 }' 00:13:34.351 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:34.351 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:34.351 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:34.351 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:34.351 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:34.351 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:34.351 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:34.609 04:14:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:34.609 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:34.609 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:34.609 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:34.609 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:34.609 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:34.609 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:13:34.867 [2024-05-15 04:14:22.736256] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:34.867 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5 00:13:34.867 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5 ']' 00:13:34.867 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:35.126 [2024-05-15 04:14:22.984650] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:35.126 [2024-05-15 04:14:22.984684] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:35.126 [2024-05-15 04:14:22.984771] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:35.126 [2024-05-15 04:14:22.984878] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:35.126 [2024-05-15 04:14:22.984895] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a728e0 name raid_bdev1, state offline 00:13:35.126 04:14:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.126 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:13:35.383 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:13:35.383 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:13:35.383 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:13:35.383 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:35.642 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:13:35.642 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:35.900 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:13:35.900 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:36.158 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:36.158 04:14:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:36.416 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:36.674 [2024-05-15 04:14:24.468893] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:36.674 [2024-05-15 04:14:24.470294] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:36.674 [2024-05-15 04:14:24.470340] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:36.674 [2024-05-15 04:14:24.470397] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:36.674 [2024-05-15 04:14:24.470451] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:36.674 [2024-05-15 04:14:24.470477] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:36.674 [2024-05-15 04:14:24.470498] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:36.674 [2024-05-15 04:14:24.470509] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a70530 name raid_bdev1, state configuring 00:13:36.674 request: 
00:13:36.674 { 00:13:36.674 "name": "raid_bdev1", 00:13:36.674 "raid_level": "raid1", 00:13:36.674 "base_bdevs": [ 00:13:36.674 "malloc1", 00:13:36.674 "malloc2", 00:13:36.674 "malloc3" 00:13:36.674 ], 00:13:36.674 "superblock": false, 00:13:36.674 "method": "bdev_raid_create", 00:13:36.674 "req_id": 1 00:13:36.674 } 00:13:36.674 Got JSON-RPC error response 00:13:36.674 response: 00:13:36.674 { 00:13:36.674 "code": -17, 00:13:36.674 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:36.674 } 00:13:36.674 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:36.674 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:36.674 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:36.674 04:14:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:36.674 04:14:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.674 04:14:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:13:36.932 04:14:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:13:36.932 04:14:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:13:36.932 04:14:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:37.190 [2024-05-15 04:14:24.986150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:37.190 [2024-05-15 04:14:24.986224] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:37.190 [2024-05-15 04:14:24.986253] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c29dd0 00:13:37.190 [2024-05-15 04:14:24.986268] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:37.190 [2024-05-15 04:14:24.988015] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:37.190 [2024-05-15 04:14:24.988044] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:37.190 [2024-05-15 04:14:24.988137] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:37.190 [2024-05-15 04:14:24.988188] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:37.190 pt1 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 
00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.190 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:37.449 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:37.449 "name": "raid_bdev1", 00:13:37.449 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:37.449 "strip_size_kb": 0, 00:13:37.449 "state": "configuring", 00:13:37.449 "raid_level": "raid1", 00:13:37.449 "superblock": true, 00:13:37.449 "num_base_bdevs": 3, 00:13:37.449 "num_base_bdevs_discovered": 1, 00:13:37.449 "num_base_bdevs_operational": 3, 00:13:37.449 "base_bdevs_list": [ 00:13:37.449 { 00:13:37.449 "name": "pt1", 00:13:37.449 "uuid": "8eb0f9c1-618c-548d-a5d6-7b2784936a6b", 00:13:37.449 "is_configured": true, 00:13:37.449 "data_offset": 2048, 00:13:37.449 "data_size": 63488 00:13:37.449 }, 00:13:37.449 { 00:13:37.449 "name": null, 00:13:37.449 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:37.449 "is_configured": false, 00:13:37.449 "data_offset": 2048, 00:13:37.449 "data_size": 63488 00:13:37.449 }, 00:13:37.449 { 00:13:37.449 "name": null, 00:13:37.449 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:37.449 "is_configured": false, 00:13:37.449 "data_offset": 2048, 00:13:37.449 "data_size": 63488 00:13:37.449 } 00:13:37.449 ] 00:13:37.449 }' 00:13:37.449 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:37.449 04:14:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.015 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 3 -gt 2 ']' 00:13:38.015 04:14:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:38.015 [2024-05-15 04:14:26.004876] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:38.015 [2024-05-15 04:14:26.004942] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:38.015 [2024-05-15 04:14:26.004967] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a717d0 00:13:38.015 [2024-05-15 04:14:26.004980] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:38.015 [2024-05-15 04:14:26.005363] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:38.015 [2024-05-15 04:14:26.005385] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:38.015 [2024-05-15 04:14:26.005461] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:38.015 [2024-05-15 04:14:26.005486] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:38.015 pt2 00:13:38.015 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:38.581 [2024-05-15 04:14:26.293689] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 
00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:38.581 "name": "raid_bdev1", 00:13:38.581 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:38.581 "strip_size_kb": 0, 00:13:38.581 "state": "configuring", 00:13:38.581 "raid_level": "raid1", 00:13:38.581 "superblock": true, 00:13:38.581 "num_base_bdevs": 3, 00:13:38.581 "num_base_bdevs_discovered": 1, 00:13:38.581 "num_base_bdevs_operational": 3, 00:13:38.581 "base_bdevs_list": [ 00:13:38.581 { 00:13:38.581 "name": "pt1", 00:13:38.581 "uuid": "8eb0f9c1-618c-548d-a5d6-7b2784936a6b", 00:13:38.581 "is_configured": true, 00:13:38.581 "data_offset": 2048, 00:13:38.581 "data_size": 63488 00:13:38.581 }, 00:13:38.581 { 00:13:38.581 "name": null, 00:13:38.581 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:38.581 "is_configured": false, 00:13:38.581 "data_offset": 2048, 00:13:38.581 "data_size": 63488 00:13:38.581 }, 00:13:38.581 { 00:13:38.581 "name": null, 00:13:38.581 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:38.581 "is_configured": false, 00:13:38.581 "data_offset": 2048, 00:13:38.581 "data_size": 63488 00:13:38.581 } 00:13:38.581 ] 00:13:38.581 }' 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:38.581 04:14:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.147 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:13:39.147 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:13:39.147 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:39.405 [2024-05-15 04:14:27.340455] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:39.405 [2024-05-15 04:14:27.340526] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.405 [2024-05-15 04:14:27.340552] 
vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a6ee30 00:13:39.405 [2024-05-15 04:14:27.340566] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.405 [2024-05-15 04:14:27.340999] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.405 [2024-05-15 04:14:27.341025] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:39.405 [2024-05-15 04:14:27.341116] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:39.405 [2024-05-15 04:14:27.341160] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:39.405 pt2 00:13:39.405 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:13:39.405 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:13:39.405 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:39.663 [2024-05-15 04:14:27.633229] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:39.663 [2024-05-15 04:14:27.633286] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.663 [2024-05-15 04:14:27.633315] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a71510 00:13:39.663 [2024-05-15 04:14:27.633328] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.663 [2024-05-15 04:14:27.633668] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.663 [2024-05-15 04:14:27.633692] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:39.663 [2024-05-15 04:14:27.633764] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:39.663 [2024-05-15 04:14:27.633788] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:39.663 [2024-05-15 04:14:27.633931] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a71af0 00:13:39.663 [2024-05-15 04:14:27.633945] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:39.663 [2024-05-15 04:14:27.634090] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a75810 00:13:39.663 [2024-05-15 04:14:27.634235] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a71af0 00:13:39.663 [2024-05-15 04:14:27.634249] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a71af0 00:13:39.663 [2024-05-15 04:14:27.634343] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:39.663 pt3 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
raid_level=raid1 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.663 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:39.921 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:39.921 "name": "raid_bdev1", 00:13:39.921 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:39.921 "strip_size_kb": 0, 00:13:39.921 "state": "online", 00:13:39.921 "raid_level": "raid1", 00:13:39.921 "superblock": true, 00:13:39.921 "num_base_bdevs": 3, 00:13:39.921 "num_base_bdevs_discovered": 3, 00:13:39.921 "num_base_bdevs_operational": 3, 00:13:39.921 "base_bdevs_list": [ 00:13:39.921 { 00:13:39.921 "name": "pt1", 00:13:39.921 "uuid": "8eb0f9c1-618c-548d-a5d6-7b2784936a6b", 00:13:39.921 "is_configured": true, 00:13:39.921 "data_offset": 2048, 00:13:39.921 "data_size": 63488 00:13:39.921 }, 00:13:39.921 { 00:13:39.921 "name": "pt2", 00:13:39.921 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:39.921 "is_configured": true, 00:13:39.921 "data_offset": 2048, 00:13:39.921 "data_size": 63488 00:13:39.921 }, 00:13:39.921 { 00:13:39.921 "name": "pt3", 00:13:39.921 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:39.921 "is_configured": true, 00:13:39.921 "data_offset": 2048, 00:13:39.921 "data_size": 63488 00:13:39.921 } 00:13:39.921 ] 00:13:39.921 }' 00:13:39.921 04:14:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:39.921 04:14:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:40.485 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:13:40.485 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:13:40.486 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:13:40.486 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:13:40.486 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:13:40.486 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:13:40.486 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:40.486 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:13:40.743 [2024-05-15 04:14:28.668155] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:40.743 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:13:40.743 "name": "raid_bdev1", 
00:13:40.743 "aliases": [ 00:13:40.743 "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5" 00:13:40.743 ], 00:13:40.743 "product_name": "Raid Volume", 00:13:40.743 "block_size": 512, 00:13:40.743 "num_blocks": 63488, 00:13:40.743 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:40.743 "assigned_rate_limits": { 00:13:40.743 "rw_ios_per_sec": 0, 00:13:40.743 "rw_mbytes_per_sec": 0, 00:13:40.743 "r_mbytes_per_sec": 0, 00:13:40.743 "w_mbytes_per_sec": 0 00:13:40.743 }, 00:13:40.743 "claimed": false, 00:13:40.743 "zoned": false, 00:13:40.743 "supported_io_types": { 00:13:40.743 "read": true, 00:13:40.744 "write": true, 00:13:40.744 "unmap": false, 00:13:40.744 "write_zeroes": true, 00:13:40.744 "flush": false, 00:13:40.744 "reset": true, 00:13:40.744 "compare": false, 00:13:40.744 "compare_and_write": false, 00:13:40.744 "abort": false, 00:13:40.744 "nvme_admin": false, 00:13:40.744 "nvme_io": false 00:13:40.744 }, 00:13:40.744 "memory_domains": [ 00:13:40.744 { 00:13:40.744 "dma_device_id": "system", 00:13:40.744 "dma_device_type": 1 00:13:40.744 }, 00:13:40.744 { 00:13:40.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.744 "dma_device_type": 2 00:13:40.744 }, 00:13:40.744 { 00:13:40.744 "dma_device_id": "system", 00:13:40.744 "dma_device_type": 1 00:13:40.744 }, 00:13:40.744 { 00:13:40.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.744 "dma_device_type": 2 00:13:40.744 }, 00:13:40.744 { 00:13:40.744 "dma_device_id": "system", 00:13:40.744 "dma_device_type": 1 00:13:40.744 }, 00:13:40.744 { 00:13:40.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.744 "dma_device_type": 2 00:13:40.744 } 00:13:40.744 ], 00:13:40.744 "driver_specific": { 00:13:40.744 "raid": { 00:13:40.744 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:40.744 "strip_size_kb": 0, 00:13:40.744 "state": "online", 00:13:40.744 "raid_level": "raid1", 00:13:40.744 "superblock": true, 00:13:40.744 "num_base_bdevs": 3, 00:13:40.744 "num_base_bdevs_discovered": 3, 00:13:40.744 "num_base_bdevs_operational": 3, 00:13:40.744 "base_bdevs_list": [ 00:13:40.744 { 00:13:40.744 "name": "pt1", 00:13:40.744 "uuid": "8eb0f9c1-618c-548d-a5d6-7b2784936a6b", 00:13:40.744 "is_configured": true, 00:13:40.744 "data_offset": 2048, 00:13:40.744 "data_size": 63488 00:13:40.744 }, 00:13:40.744 { 00:13:40.744 "name": "pt2", 00:13:40.744 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:40.744 "is_configured": true, 00:13:40.744 "data_offset": 2048, 00:13:40.744 "data_size": 63488 00:13:40.744 }, 00:13:40.744 { 00:13:40.744 "name": "pt3", 00:13:40.744 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:40.744 "is_configured": true, 00:13:40.744 "data_offset": 2048, 00:13:40.744 "data_size": 63488 00:13:40.744 } 00:13:40.744 ] 00:13:40.744 } 00:13:40.744 } 00:13:40.744 }' 00:13:40.744 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:40.744 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:13:40.744 pt2 00:13:40.744 pt3' 00:13:40.744 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:40.744 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:40.744 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:41.002 04:14:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:41.002 "name": "pt1", 00:13:41.002 "aliases": [ 00:13:41.002 "8eb0f9c1-618c-548d-a5d6-7b2784936a6b" 00:13:41.002 ], 00:13:41.002 "product_name": "passthru", 00:13:41.002 "block_size": 512, 00:13:41.002 "num_blocks": 65536, 00:13:41.002 "uuid": "8eb0f9c1-618c-548d-a5d6-7b2784936a6b", 00:13:41.002 "assigned_rate_limits": { 00:13:41.002 "rw_ios_per_sec": 0, 00:13:41.002 "rw_mbytes_per_sec": 0, 00:13:41.002 "r_mbytes_per_sec": 0, 00:13:41.002 "w_mbytes_per_sec": 0 00:13:41.002 }, 00:13:41.002 "claimed": true, 00:13:41.002 "claim_type": "exclusive_write", 00:13:41.002 "zoned": false, 00:13:41.002 "supported_io_types": { 00:13:41.002 "read": true, 00:13:41.002 "write": true, 00:13:41.002 "unmap": true, 00:13:41.002 "write_zeroes": true, 00:13:41.002 "flush": true, 00:13:41.002 "reset": true, 00:13:41.002 "compare": false, 00:13:41.002 "compare_and_write": false, 00:13:41.002 "abort": true, 00:13:41.002 "nvme_admin": false, 00:13:41.002 "nvme_io": false 00:13:41.002 }, 00:13:41.002 "memory_domains": [ 00:13:41.002 { 00:13:41.002 "dma_device_id": "system", 00:13:41.002 "dma_device_type": 1 00:13:41.002 }, 00:13:41.002 { 00:13:41.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.002 "dma_device_type": 2 00:13:41.002 } 00:13:41.002 ], 00:13:41.002 "driver_specific": { 00:13:41.002 "passthru": { 00:13:41.002 "name": "pt1", 00:13:41.002 "base_bdev_name": "malloc1" 00:13:41.002 } 00:13:41.002 } 00:13:41.002 }' 00:13:41.002 04:14:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:41.002 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:41.260 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:41.518 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:41.519 "name": "pt2", 00:13:41.519 "aliases": [ 00:13:41.519 "39e4c2d1-37eb-5342-bfc5-c595e17912ae" 00:13:41.519 ], 00:13:41.519 "product_name": "passthru", 00:13:41.519 "block_size": 512, 00:13:41.519 "num_blocks": 65536, 00:13:41.519 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:41.519 "assigned_rate_limits": { 00:13:41.519 "rw_ios_per_sec": 0, 00:13:41.519 
"rw_mbytes_per_sec": 0, 00:13:41.519 "r_mbytes_per_sec": 0, 00:13:41.519 "w_mbytes_per_sec": 0 00:13:41.519 }, 00:13:41.519 "claimed": true, 00:13:41.519 "claim_type": "exclusive_write", 00:13:41.519 "zoned": false, 00:13:41.519 "supported_io_types": { 00:13:41.519 "read": true, 00:13:41.519 "write": true, 00:13:41.519 "unmap": true, 00:13:41.519 "write_zeroes": true, 00:13:41.519 "flush": true, 00:13:41.519 "reset": true, 00:13:41.519 "compare": false, 00:13:41.519 "compare_and_write": false, 00:13:41.519 "abort": true, 00:13:41.519 "nvme_admin": false, 00:13:41.519 "nvme_io": false 00:13:41.519 }, 00:13:41.519 "memory_domains": [ 00:13:41.519 { 00:13:41.519 "dma_device_id": "system", 00:13:41.519 "dma_device_type": 1 00:13:41.519 }, 00:13:41.519 { 00:13:41.519 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.519 "dma_device_type": 2 00:13:41.519 } 00:13:41.519 ], 00:13:41.519 "driver_specific": { 00:13:41.519 "passthru": { 00:13:41.519 "name": "pt2", 00:13:41.519 "base_bdev_name": "malloc2" 00:13:41.519 } 00:13:41.519 } 00:13:41.519 }' 00:13:41.519 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:41.777 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:41.777 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:41.777 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:41.777 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:41.777 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:41.777 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:41.777 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:41.777 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:41.777 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:41.777 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:42.035 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:42.035 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:13:42.035 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:42.035 04:14:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:13:42.293 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:13:42.293 "name": "pt3", 00:13:42.293 "aliases": [ 00:13:42.293 "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe" 00:13:42.293 ], 00:13:42.293 "product_name": "passthru", 00:13:42.293 "block_size": 512, 00:13:42.293 "num_blocks": 65536, 00:13:42.293 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:42.293 "assigned_rate_limits": { 00:13:42.293 "rw_ios_per_sec": 0, 00:13:42.293 "rw_mbytes_per_sec": 0, 00:13:42.293 "r_mbytes_per_sec": 0, 00:13:42.293 "w_mbytes_per_sec": 0 00:13:42.293 }, 00:13:42.293 "claimed": true, 00:13:42.293 "claim_type": "exclusive_write", 00:13:42.293 "zoned": false, 00:13:42.293 "supported_io_types": { 00:13:42.293 "read": true, 00:13:42.293 "write": true, 00:13:42.293 "unmap": true, 00:13:42.293 "write_zeroes": true, 00:13:42.293 "flush": true, 00:13:42.293 "reset": true, 
00:13:42.293 "compare": false, 00:13:42.293 "compare_and_write": false, 00:13:42.293 "abort": true, 00:13:42.293 "nvme_admin": false, 00:13:42.293 "nvme_io": false 00:13:42.293 }, 00:13:42.293 "memory_domains": [ 00:13:42.293 { 00:13:42.293 "dma_device_id": "system", 00:13:42.293 "dma_device_type": 1 00:13:42.293 }, 00:13:42.293 { 00:13:42.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.293 "dma_device_type": 2 00:13:42.293 } 00:13:42.293 ], 00:13:42.293 "driver_specific": { 00:13:42.293 "passthru": { 00:13:42.293 "name": "pt3", 00:13:42.293 "base_bdev_name": "malloc3" 00:13:42.293 } 00:13:42.293 } 00:13:42.293 }' 00:13:42.293 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:42.293 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:13:42.293 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:13:42.293 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:42.293 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:13:42.293 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:42.293 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:42.293 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:13:42.551 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:42.551 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:42.551 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:13:42.551 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:13:42.551 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:42.551 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:13:42.809 [2024-05-15 04:14:30.633537] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:42.809 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5 '!=' f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5 ']' 00:13:42.809 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:13:42.809 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:13:42.809 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 0 00:13:42.809 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:43.067 [2024-05-15 04:14:30.922094] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
strip_size=0 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.067 04:14:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:43.325 04:14:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:43.325 "name": "raid_bdev1", 00:13:43.325 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:43.325 "strip_size_kb": 0, 00:13:43.325 "state": "online", 00:13:43.325 "raid_level": "raid1", 00:13:43.325 "superblock": true, 00:13:43.325 "num_base_bdevs": 3, 00:13:43.325 "num_base_bdevs_discovered": 2, 00:13:43.325 "num_base_bdevs_operational": 2, 00:13:43.325 "base_bdevs_list": [ 00:13:43.325 { 00:13:43.325 "name": null, 00:13:43.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:43.325 "is_configured": false, 00:13:43.325 "data_offset": 2048, 00:13:43.325 "data_size": 63488 00:13:43.325 }, 00:13:43.325 { 00:13:43.325 "name": "pt2", 00:13:43.325 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:43.325 "is_configured": true, 00:13:43.325 "data_offset": 2048, 00:13:43.325 "data_size": 63488 00:13:43.325 }, 00:13:43.325 { 00:13:43.325 "name": "pt3", 00:13:43.325 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:43.325 "is_configured": true, 00:13:43.325 "data_offset": 2048, 00:13:43.325 "data_size": 63488 00:13:43.325 } 00:13:43.325 ] 00:13:43.325 }' 00:13:43.325 04:14:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:43.325 04:14:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.891 04:14:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:44.148 [2024-05-15 04:14:32.000894] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:44.148 [2024-05-15 04:14:32.000926] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:44.148 [2024-05-15 04:14:32.000998] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:44.148 [2024-05-15 04:14:32.001061] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:44.148 [2024-05-15 04:14:32.001075] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a71af0 name raid_bdev1, state offline 00:13:44.148 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.148 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:13:44.406 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:13:44.406 04:14:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:13:44.407 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:13:44.407 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:13:44.407 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:44.665 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:13:44.665 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:13:44.665 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:44.922 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:13:44.922 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:13:44.922 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:13:44.922 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:13:44.922 04:14:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:45.180 [2024-05-15 04:14:33.107810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:45.180 [2024-05-15 04:14:33.107892] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:45.180 [2024-05-15 04:14:33.107924] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a71370 00:13:45.180 [2024-05-15 04:14:33.107947] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:45.180 [2024-05-15 04:14:33.109784] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:45.180 [2024-05-15 04:14:33.109815] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:45.180 [2024-05-15 04:14:33.109927] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:45.180 [2024-05-15 04:14:33.109974] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:45.180 pt2 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:45.180 
04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.180 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:45.437 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:45.437 "name": "raid_bdev1", 00:13:45.437 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:45.437 "strip_size_kb": 0, 00:13:45.437 "state": "configuring", 00:13:45.437 "raid_level": "raid1", 00:13:45.437 "superblock": true, 00:13:45.437 "num_base_bdevs": 3, 00:13:45.437 "num_base_bdevs_discovered": 1, 00:13:45.437 "num_base_bdevs_operational": 2, 00:13:45.437 "base_bdevs_list": [ 00:13:45.437 { 00:13:45.437 "name": null, 00:13:45.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.437 "is_configured": false, 00:13:45.437 "data_offset": 2048, 00:13:45.437 "data_size": 63488 00:13:45.437 }, 00:13:45.437 { 00:13:45.437 "name": "pt2", 00:13:45.437 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:45.437 "is_configured": true, 00:13:45.437 "data_offset": 2048, 00:13:45.437 "data_size": 63488 00:13:45.437 }, 00:13:45.437 { 00:13:45.437 "name": null, 00:13:45.437 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:45.437 "is_configured": false, 00:13:45.437 "data_offset": 2048, 00:13:45.437 "data_size": 63488 00:13:45.437 } 00:13:45.437 ] 00:13:45.437 }' 00:13:45.437 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:45.437 04:14:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:46.000 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i++ )) 00:13:46.000 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:13:46.000 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # i=2 00:13:46.000 04:14:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:46.256 [2024-05-15 04:14:34.206910] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:46.256 [2024-05-15 04:14:34.206981] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:46.256 [2024-05-15 04:14:34.207024] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a77d70 00:13:46.256 [2024-05-15 04:14:34.207042] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:46.256 [2024-05-15 04:14:34.207466] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:46.256 [2024-05-15 04:14:34.207493] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:46.256 [2024-05-15 04:14:34.207583] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:46.256 [2024-05-15 04:14:34.207613] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:46.256 [2024-05-15 04:14:34.207741] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a78090 00:13:46.256 [2024-05-15 04:14:34.207758] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 
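What the superblock test is exercising at this point: raid_bdev1 was deleted while its base bdevs kept their on-disk superblocks, and re-creating the passthru bdevs one at a time lets the examine path re-claim them, taking the raid1 bdev from configuring back to online once two of the three members are present again. A condensed sketch of that RPC sequence, reusing the rpc.py path, socket and fixed test UUIDs that appear in this run (the trailing .state filter is an illustrative jq addition, not part of bdev_raid.sh):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    # re-create pt2 on top of malloc2; examine finds the raid superblock on it and claims it
    $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'   # still "configuring"
    # re-create pt3; with 2 of 3 base bdevs back, raid1 has enough redundancy to go online
    $rpc bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .state'   # now "online"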
00:13:46.256 [2024-05-15 04:14:34.207939] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a74ef0 00:13:46.256 [2024-05-15 04:14:34.208100] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a78090 00:13:46.256 [2024-05-15 04:14:34.208117] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a78090 00:13:46.256 [2024-05-15 04:14:34.208232] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:46.256 pt3 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.256 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:46.512 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:46.512 "name": "raid_bdev1", 00:13:46.512 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:46.512 "strip_size_kb": 0, 00:13:46.512 "state": "online", 00:13:46.512 "raid_level": "raid1", 00:13:46.512 "superblock": true, 00:13:46.512 "num_base_bdevs": 3, 00:13:46.512 "num_base_bdevs_discovered": 2, 00:13:46.512 "num_base_bdevs_operational": 2, 00:13:46.512 "base_bdevs_list": [ 00:13:46.512 { 00:13:46.512 "name": null, 00:13:46.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:46.512 "is_configured": false, 00:13:46.512 "data_offset": 2048, 00:13:46.512 "data_size": 63488 00:13:46.512 }, 00:13:46.512 { 00:13:46.512 "name": "pt2", 00:13:46.512 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:46.512 "is_configured": true, 00:13:46.512 "data_offset": 2048, 00:13:46.512 "data_size": 63488 00:13:46.512 }, 00:13:46.512 { 00:13:46.512 "name": "pt3", 00:13:46.512 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:46.512 "is_configured": true, 00:13:46.512 "data_offset": 2048, 00:13:46.512 "data_size": 63488 00:13:46.512 } 00:13:46.512 ] 00:13:46.512 }' 00:13:46.512 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:46.512 04:14:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:47.076 04:14:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:47.333 
[2024-05-15 04:14:35.217572] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:47.333 [2024-05-15 04:14:35.217605] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:47.333 [2024-05-15 04:14:35.217693] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:47.333 [2024-05-15 04:14:35.217762] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:47.333 [2024-05-15 04:14:35.217776] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a78090 name raid_bdev1, state offline 00:13:47.333 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.333 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # jq -r '.[]' 00:13:47.590 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # raid_bdev= 00:13:47.590 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # '[' -n '' ']' 00:13:47.590 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@532 -- # '[' 3 -gt 2 ']' 00:13:47.590 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=2 00:13:47.590 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:47.846 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:48.104 [2024-05-15 04:14:35.975537] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:48.104 [2024-05-15 04:14:35.975596] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:48.104 [2024-05-15 04:14:35.975624] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a73510 00:13:48.104 [2024-05-15 04:14:35.975640] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:48.104 [2024-05-15 04:14:35.977378] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:48.104 [2024-05-15 04:14:35.977408] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:48.104 [2024-05-15 04:14:35.977496] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:48.104 [2024-05-15 04:14:35.977539] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:48.104 [2024-05-15 04:14:35.977664] bdev_raid.c:3487:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:13:48.104 [2024-05-15 04:14:35.977683] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:48.104 [2024-05-15 04:14:35.977700] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a75cd0 name raid_bdev1, state configuring 00:13:48.104 [2024-05-15 04:14:35.977730] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:48.104 pt1 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # '[' 3 -gt 2 ']' 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 
0 2 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:48.104 04:14:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:48.104 04:14:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.104 04:14:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:48.361 04:14:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:48.361 "name": "raid_bdev1", 00:13:48.361 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:48.361 "strip_size_kb": 0, 00:13:48.361 "state": "configuring", 00:13:48.361 "raid_level": "raid1", 00:13:48.361 "superblock": true, 00:13:48.361 "num_base_bdevs": 3, 00:13:48.361 "num_base_bdevs_discovered": 1, 00:13:48.361 "num_base_bdevs_operational": 2, 00:13:48.361 "base_bdevs_list": [ 00:13:48.361 { 00:13:48.361 "name": null, 00:13:48.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.361 "is_configured": false, 00:13:48.361 "data_offset": 2048, 00:13:48.361 "data_size": 63488 00:13:48.361 }, 00:13:48.361 { 00:13:48.361 "name": "pt2", 00:13:48.361 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:48.361 "is_configured": true, 00:13:48.361 "data_offset": 2048, 00:13:48.361 "data_size": 63488 00:13:48.361 }, 00:13:48.361 { 00:13:48.361 "name": null, 00:13:48.361 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:48.361 "is_configured": false, 00:13:48.361 "data_offset": 2048, 00:13:48.361 "data_size": 63488 00:13:48.361 } 00:13:48.361 ] 00:13:48.361 }' 00:13:48.361 04:14:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:48.361 04:14:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.926 04:14:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@546 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:13:48.926 04:14:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@546 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:49.184 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@546 -- # [[ false == \f\a\l\s\e ]] 00:13:49.184 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:13:49.442 [2024-05-15 04:14:37.311096] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:49.442 [2024-05-15 04:14:37.311171] 
vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:49.442 [2024-05-15 04:14:37.311202] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a77d70 00:13:49.442 [2024-05-15 04:14:37.311227] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:49.442 [2024-05-15 04:14:37.311694] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:49.442 [2024-05-15 04:14:37.311722] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:49.442 [2024-05-15 04:14:37.311818] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:49.442 [2024-05-15 04:14:37.311865] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:49.442 [2024-05-15 04:14:37.312002] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a759b0 00:13:49.442 [2024-05-15 04:14:37.312019] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:49.442 [2024-05-15 04:14:37.312195] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a6e2f0 00:13:49.442 [2024-05-15 04:14:37.312363] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a759b0 00:13:49.442 [2024-05-15 04:14:37.312380] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a759b0 00:13:49.442 [2024-05-15 04:14:37.312500] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:49.442 pt3 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.442 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:49.699 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:49.699 "name": "raid_bdev1", 00:13:49.699 "uuid": "f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5", 00:13:49.699 "strip_size_kb": 0, 00:13:49.699 "state": "online", 00:13:49.699 "raid_level": "raid1", 00:13:49.699 "superblock": true, 00:13:49.699 "num_base_bdevs": 3, 00:13:49.699 "num_base_bdevs_discovered": 2, 00:13:49.699 "num_base_bdevs_operational": 2, 00:13:49.699 "base_bdevs_list": [ 00:13:49.699 { 00:13:49.699 "name": null, 00:13:49.699 
"uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.699 "is_configured": false, 00:13:49.699 "data_offset": 2048, 00:13:49.699 "data_size": 63488 00:13:49.699 }, 00:13:49.699 { 00:13:49.699 "name": "pt2", 00:13:49.699 "uuid": "39e4c2d1-37eb-5342-bfc5-c595e17912ae", 00:13:49.699 "is_configured": true, 00:13:49.699 "data_offset": 2048, 00:13:49.699 "data_size": 63488 00:13:49.699 }, 00:13:49.699 { 00:13:49.699 "name": "pt3", 00:13:49.699 "uuid": "74b24a0e-9b0a-5b11-a1b2-3bcbd033fdfe", 00:13:49.699 "is_configured": true, 00:13:49.699 "data_offset": 2048, 00:13:49.699 "data_size": 63488 00:13:49.699 } 00:13:49.699 ] 00:13:49.699 }' 00:13:49.699 04:14:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:49.699 04:14:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.263 04:14:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:50.263 04:14:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:50.521 04:14:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # [[ false == \f\a\l\s\e ]] 00:13:50.521 04:14:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@558 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:50.521 04:14:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@558 -- # jq -r '.[] | .uuid' 00:13:50.778 [2024-05-15 04:14:38.715068] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@558 -- # '[' f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5 '!=' f2e2e4b1-8b31-49c9-a650-d8f0a43dc1f5 ']' 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # killprocess 3866273 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 3866273 ']' 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 3866273 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3866273 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3866273' 00:13:50.778 killing process with pid 3866273 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 3866273 00:13:50.778 [2024-05-15 04:14:38.759952] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:50.778 04:14:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 3866273 00:13:50.778 [2024-05-15 04:14:38.760043] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:50.778 [2024-05-15 04:14:38.760124] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:50.778 [2024-05-15 
04:14:38.760141] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a759b0 name raid_bdev1, state offline 00:13:51.036 [2024-05-15 04:14:38.796889] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:51.294 04:14:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@565 -- # return 0 00:13:51.294 00:13:51.294 real 0m22.273s 00:13:51.294 user 0m41.412s 00:13:51.294 sys 0m3.000s 00:13:51.294 04:14:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:13:51.294 04:14:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.294 ************************************ 00:13:51.294 END TEST raid_superblock_test 00:13:51.294 ************************************ 00:13:51.294 04:14:39 bdev_raid -- bdev/bdev_raid.sh@801 -- # for n in {2..4} 00:13:51.294 04:14:39 bdev_raid -- bdev/bdev_raid.sh@802 -- # for level in raid0 concat raid1 00:13:51.294 04:14:39 bdev_raid -- bdev/bdev_raid.sh@803 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:13:51.294 04:14:39 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:13:51.294 04:14:39 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:13:51.294 04:14:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:51.294 ************************************ 00:13:51.294 START TEST raid_state_function_test 00:13:51.294 ************************************ 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 4 false 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=3869381 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3869381' 00:13:51.294 Process raid pid: 3869381 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 3869381 /var/tmp/spdk-raid.sock 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 3869381 ']' 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:51.294 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:13:51.294 04:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.294 [2024-05-15 04:14:39.185547] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
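Everything the state-function test does from here on goes through that private RPC socket; the prologue above reduces to starting a dedicated bdev_svc app and waiting for it to answer. A minimal sketch with the same paths and helpers seen in this run (waitforlisten and killprocess are shell helpers from SPDK's autotest_common.sh, not standalone binaries):

    sock=/var/tmp/spdk-raid.sock
    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r "$sock" -i 0 -L bdev_raid &
    raid_pid=$!
    waitforlisten "$raid_pid" "$sock"     # block until the app is listening on the socket
    # ... issue the bdev_raid_create / bdev_malloc_create RPCs against $sock ...
    killprocess "$raid_pid"               # teardown at the end of the test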
00:13:51.294 [2024-05-15 04:14:39.185612] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:51.294 [2024-05-15 04:14:39.260275] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:51.552 [2024-05-15 04:14:39.366957] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:13:51.552 [2024-05-15 04:14:39.431503] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:51.552 [2024-05-15 04:14:39.431540] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:51.552 04:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:13:51.552 04:14:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:13:51.552 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:13:51.809 [2024-05-15 04:14:39.761705] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:51.809 [2024-05-15 04:14:39.761754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:51.809 [2024-05-15 04:14:39.761767] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:51.809 [2024-05-15 04:14:39.761781] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:51.809 [2024-05-15 04:14:39.761791] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:51.809 [2024-05-15 04:14:39.761802] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:51.810 [2024-05-15 04:14:39.761811] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:13:51.810 [2024-05-15 04:14:39.761835] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.810 04:14:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:52.067 04:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:52.067 "name": "Existed_Raid", 00:13:52.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.067 "strip_size_kb": 64, 00:13:52.067 "state": "configuring", 00:13:52.067 "raid_level": "raid0", 00:13:52.067 "superblock": false, 00:13:52.067 "num_base_bdevs": 4, 00:13:52.067 "num_base_bdevs_discovered": 0, 00:13:52.067 "num_base_bdevs_operational": 4, 00:13:52.067 "base_bdevs_list": [ 00:13:52.067 { 00:13:52.067 "name": "BaseBdev1", 00:13:52.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.067 "is_configured": false, 00:13:52.067 "data_offset": 0, 00:13:52.067 "data_size": 0 00:13:52.067 }, 00:13:52.067 { 00:13:52.067 "name": "BaseBdev2", 00:13:52.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.067 "is_configured": false, 00:13:52.067 "data_offset": 0, 00:13:52.067 "data_size": 0 00:13:52.067 }, 00:13:52.067 { 00:13:52.067 "name": "BaseBdev3", 00:13:52.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.067 "is_configured": false, 00:13:52.067 "data_offset": 0, 00:13:52.067 "data_size": 0 00:13:52.067 }, 00:13:52.067 { 00:13:52.067 "name": "BaseBdev4", 00:13:52.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.067 "is_configured": false, 00:13:52.067 "data_offset": 0, 00:13:52.067 "data_size": 0 00:13:52.067 } 00:13:52.067 ] 00:13:52.067 }' 00:13:52.067 04:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:52.067 04:14:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:53.016 04:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:53.016 [2024-05-15 04:14:40.884538] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:53.016 [2024-05-15 04:14:40.884574] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17d9040 name Existed_Raid, state configuring 00:13:53.016 04:14:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:13:53.321 [2024-05-15 04:14:41.117169] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:53.321 [2024-05-15 04:14:41.117209] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:53.321 [2024-05-15 04:14:41.117221] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:53.321 [2024-05-15 04:14:41.117235] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:53.321 [2024-05-15 04:14:41.117244] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:53.321 [2024-05-15 04:14:41.117257] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:53.321 [2024-05-15 04:14:41.117266] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:13:53.321 [2024-05-15 04:14:41.117278] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: 
*DEBUG*: base bdev BaseBdev4 doesn't exist now 00:13:53.321 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:53.580 [2024-05-15 04:14:41.366429] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:53.580 BaseBdev1 00:13:53.580 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:13:53.580 04:14:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:13:53.580 04:14:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:53.580 04:14:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:53.580 04:14:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:53.580 04:14:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:53.580 04:14:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:53.838 04:14:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:53.838 [ 00:13:53.838 { 00:13:53.838 "name": "BaseBdev1", 00:13:53.838 "aliases": [ 00:13:53.838 "3ed897e3-98c0-44e0-a326-913e00a14951" 00:13:53.838 ], 00:13:53.838 "product_name": "Malloc disk", 00:13:53.838 "block_size": 512, 00:13:53.838 "num_blocks": 65536, 00:13:53.838 "uuid": "3ed897e3-98c0-44e0-a326-913e00a14951", 00:13:53.838 "assigned_rate_limits": { 00:13:53.838 "rw_ios_per_sec": 0, 00:13:53.838 "rw_mbytes_per_sec": 0, 00:13:53.838 "r_mbytes_per_sec": 0, 00:13:53.838 "w_mbytes_per_sec": 0 00:13:53.838 }, 00:13:53.838 "claimed": true, 00:13:53.838 "claim_type": "exclusive_write", 00:13:53.838 "zoned": false, 00:13:53.838 "supported_io_types": { 00:13:53.838 "read": true, 00:13:53.838 "write": true, 00:13:53.838 "unmap": true, 00:13:53.838 "write_zeroes": true, 00:13:53.838 "flush": true, 00:13:53.838 "reset": true, 00:13:53.838 "compare": false, 00:13:53.838 "compare_and_write": false, 00:13:53.838 "abort": true, 00:13:53.838 "nvme_admin": false, 00:13:53.838 "nvme_io": false 00:13:53.838 }, 00:13:53.838 "memory_domains": [ 00:13:53.838 { 00:13:53.838 "dma_device_id": "system", 00:13:53.838 "dma_device_type": 1 00:13:53.838 }, 00:13:53.838 { 00:13:53.838 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.838 "dma_device_type": 2 00:13:53.838 } 00:13:53.838 ], 00:13:53.838 "driver_specific": {} 00:13:53.838 } 00:13:53.838 ] 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # 
local strip_size=64 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.096 04:14:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:54.096 04:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:54.096 "name": "Existed_Raid", 00:13:54.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:54.096 "strip_size_kb": 64, 00:13:54.096 "state": "configuring", 00:13:54.096 "raid_level": "raid0", 00:13:54.096 "superblock": false, 00:13:54.096 "num_base_bdevs": 4, 00:13:54.096 "num_base_bdevs_discovered": 1, 00:13:54.096 "num_base_bdevs_operational": 4, 00:13:54.096 "base_bdevs_list": [ 00:13:54.096 { 00:13:54.096 "name": "BaseBdev1", 00:13:54.096 "uuid": "3ed897e3-98c0-44e0-a326-913e00a14951", 00:13:54.096 "is_configured": true, 00:13:54.096 "data_offset": 0, 00:13:54.096 "data_size": 65536 00:13:54.096 }, 00:13:54.096 { 00:13:54.096 "name": "BaseBdev2", 00:13:54.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:54.096 "is_configured": false, 00:13:54.096 "data_offset": 0, 00:13:54.096 "data_size": 0 00:13:54.096 }, 00:13:54.096 { 00:13:54.096 "name": "BaseBdev3", 00:13:54.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:54.096 "is_configured": false, 00:13:54.096 "data_offset": 0, 00:13:54.096 "data_size": 0 00:13:54.096 }, 00:13:54.096 { 00:13:54.096 "name": "BaseBdev4", 00:13:54.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:54.096 "is_configured": false, 00:13:54.096 "data_offset": 0, 00:13:54.096 "data_size": 0 00:13:54.096 } 00:13:54.096 ] 00:13:54.096 }' 00:13:54.096 04:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:54.096 04:14:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.660 04:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:54.918 [2024-05-15 04:14:42.854336] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:54.918 [2024-05-15 04:14:42.854398] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17d88b0 name Existed_Raid, state configuring 00:13:54.918 04:14:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:13:55.175 [2024-05-15 04:14:43.095002] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:55.175 [2024-05-15 04:14:43.096499] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:55.175 
[2024-05-15 04:14:43.096533] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:55.175 [2024-05-15 04:14:43.096556] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:55.175 [2024-05-15 04:14:43.096570] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:55.175 [2024-05-15 04:14:43.096579] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:13:55.175 [2024-05-15 04:14:43.096592] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.175 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.433 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:55.433 "name": "Existed_Raid", 00:13:55.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.433 "strip_size_kb": 64, 00:13:55.433 "state": "configuring", 00:13:55.433 "raid_level": "raid0", 00:13:55.433 "superblock": false, 00:13:55.433 "num_base_bdevs": 4, 00:13:55.433 "num_base_bdevs_discovered": 1, 00:13:55.433 "num_base_bdevs_operational": 4, 00:13:55.433 "base_bdevs_list": [ 00:13:55.433 { 00:13:55.433 "name": "BaseBdev1", 00:13:55.433 "uuid": "3ed897e3-98c0-44e0-a326-913e00a14951", 00:13:55.433 "is_configured": true, 00:13:55.433 "data_offset": 0, 00:13:55.433 "data_size": 65536 00:13:55.433 }, 00:13:55.433 { 00:13:55.433 "name": "BaseBdev2", 00:13:55.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.433 "is_configured": false, 00:13:55.433 "data_offset": 0, 00:13:55.433 "data_size": 0 00:13:55.433 }, 00:13:55.433 { 00:13:55.433 "name": "BaseBdev3", 00:13:55.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.433 "is_configured": false, 00:13:55.433 "data_offset": 0, 00:13:55.433 "data_size": 0 00:13:55.433 }, 00:13:55.433 { 00:13:55.433 "name": 
"BaseBdev4", 00:13:55.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.433 "is_configured": false, 00:13:55.433 "data_offset": 0, 00:13:55.433 "data_size": 0 00:13:55.433 } 00:13:55.433 ] 00:13:55.433 }' 00:13:55.433 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:55.433 04:14:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:55.997 04:14:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:56.254 [2024-05-15 04:14:44.163758] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:56.254 BaseBdev2 00:13:56.254 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:13:56.254 04:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:13:56.254 04:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:56.254 04:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:56.254 04:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:56.254 04:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:56.254 04:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:56.511 04:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:56.769 [ 00:13:56.769 { 00:13:56.769 "name": "BaseBdev2", 00:13:56.769 "aliases": [ 00:13:56.769 "7df1515f-b648-4d10-b1a1-1c99b3c08a4d" 00:13:56.769 ], 00:13:56.769 "product_name": "Malloc disk", 00:13:56.769 "block_size": 512, 00:13:56.769 "num_blocks": 65536, 00:13:56.769 "uuid": "7df1515f-b648-4d10-b1a1-1c99b3c08a4d", 00:13:56.769 "assigned_rate_limits": { 00:13:56.769 "rw_ios_per_sec": 0, 00:13:56.769 "rw_mbytes_per_sec": 0, 00:13:56.769 "r_mbytes_per_sec": 0, 00:13:56.769 "w_mbytes_per_sec": 0 00:13:56.769 }, 00:13:56.769 "claimed": true, 00:13:56.769 "claim_type": "exclusive_write", 00:13:56.769 "zoned": false, 00:13:56.769 "supported_io_types": { 00:13:56.769 "read": true, 00:13:56.769 "write": true, 00:13:56.769 "unmap": true, 00:13:56.769 "write_zeroes": true, 00:13:56.769 "flush": true, 00:13:56.769 "reset": true, 00:13:56.769 "compare": false, 00:13:56.769 "compare_and_write": false, 00:13:56.769 "abort": true, 00:13:56.769 "nvme_admin": false, 00:13:56.769 "nvme_io": false 00:13:56.769 }, 00:13:56.769 "memory_domains": [ 00:13:56.769 { 00:13:56.769 "dma_device_id": "system", 00:13:56.769 "dma_device_type": 1 00:13:56.769 }, 00:13:56.769 { 00:13:56.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.769 "dma_device_type": 2 00:13:56.769 } 00:13:56.769 ], 00:13:56.769 "driver_specific": {} 00:13:56.769 } 00:13:56.769 ] 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 
00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.769 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.027 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:57.027 "name": "Existed_Raid", 00:13:57.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.027 "strip_size_kb": 64, 00:13:57.027 "state": "configuring", 00:13:57.027 "raid_level": "raid0", 00:13:57.027 "superblock": false, 00:13:57.027 "num_base_bdevs": 4, 00:13:57.027 "num_base_bdevs_discovered": 2, 00:13:57.027 "num_base_bdevs_operational": 4, 00:13:57.027 "base_bdevs_list": [ 00:13:57.027 { 00:13:57.027 "name": "BaseBdev1", 00:13:57.027 "uuid": "3ed897e3-98c0-44e0-a326-913e00a14951", 00:13:57.027 "is_configured": true, 00:13:57.027 "data_offset": 0, 00:13:57.027 "data_size": 65536 00:13:57.027 }, 00:13:57.027 { 00:13:57.027 "name": "BaseBdev2", 00:13:57.027 "uuid": "7df1515f-b648-4d10-b1a1-1c99b3c08a4d", 00:13:57.027 "is_configured": true, 00:13:57.027 "data_offset": 0, 00:13:57.027 "data_size": 65536 00:13:57.027 }, 00:13:57.027 { 00:13:57.027 "name": "BaseBdev3", 00:13:57.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.027 "is_configured": false, 00:13:57.027 "data_offset": 0, 00:13:57.027 "data_size": 0 00:13:57.027 }, 00:13:57.027 { 00:13:57.027 "name": "BaseBdev4", 00:13:57.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.027 "is_configured": false, 00:13:57.027 "data_offset": 0, 00:13:57.027 "data_size": 0 00:13:57.027 } 00:13:57.027 ] 00:13:57.027 }' 00:13:57.027 04:14:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:57.027 04:14:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.593 04:14:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:57.851 [2024-05-15 04:14:45.741510] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:57.851 BaseBdev3 00:13:57.851 04:14:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:13:57.851 04:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:13:57.851 04:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:57.851 04:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:57.851 04:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:57.851 04:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:57.851 04:14:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:58.109 04:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:58.367 [ 00:13:58.367 { 00:13:58.367 "name": "BaseBdev3", 00:13:58.367 "aliases": [ 00:13:58.367 "04017058-f0b0-43e5-bcdb-ab765d166926" 00:13:58.367 ], 00:13:58.367 "product_name": "Malloc disk", 00:13:58.367 "block_size": 512, 00:13:58.367 "num_blocks": 65536, 00:13:58.367 "uuid": "04017058-f0b0-43e5-bcdb-ab765d166926", 00:13:58.367 "assigned_rate_limits": { 00:13:58.367 "rw_ios_per_sec": 0, 00:13:58.367 "rw_mbytes_per_sec": 0, 00:13:58.367 "r_mbytes_per_sec": 0, 00:13:58.367 "w_mbytes_per_sec": 0 00:13:58.367 }, 00:13:58.367 "claimed": true, 00:13:58.367 "claim_type": "exclusive_write", 00:13:58.367 "zoned": false, 00:13:58.367 "supported_io_types": { 00:13:58.367 "read": true, 00:13:58.367 "write": true, 00:13:58.367 "unmap": true, 00:13:58.367 "write_zeroes": true, 00:13:58.367 "flush": true, 00:13:58.367 "reset": true, 00:13:58.367 "compare": false, 00:13:58.367 "compare_and_write": false, 00:13:58.367 "abort": true, 00:13:58.367 "nvme_admin": false, 00:13:58.367 "nvme_io": false 00:13:58.367 }, 00:13:58.367 "memory_domains": [ 00:13:58.367 { 00:13:58.367 "dma_device_id": "system", 00:13:58.367 "dma_device_type": 1 00:13:58.367 }, 00:13:58.367 { 00:13:58.367 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.367 "dma_device_type": 2 00:13:58.367 } 00:13:58.367 ], 00:13:58.367 "driver_specific": {} 00:13:58.367 } 00:13:58.367 ] 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 
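After each base bdev is claimed, verify_raid_bdev_state pulls the Existed_Raid entry out of bdev_raid_get_bdevs and compares fields such as "state" and "num_base_bdevs_discovered" against the expected values. A hedged sketch of that kind of check (the helper name check_raid_state is illustrative only, not part of the test script; the jq filter and field names are taken from the dumps in this log):

check_raid_state() {
    # $1 = expected state, $2 = expected num_base_bdevs_discovered (illustrative helper)
    local info
    info=$("$SPDK/scripts/rpc.py" -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
        jq -r '.[] | select(.name == "Existed_Raid")')
    [ "$(echo "$info" | jq -r .state)" = "$1" ] &&
        [ "$(echo "$info" | jq -r .num_base_bdevs_discovered)" -eq "$2" ]
}
# e.g. after BaseBdev3 is claimed above: check_raid_state configuring 3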
00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.367 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:58.626 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:13:58.626 "name": "Existed_Raid", 00:13:58.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.626 "strip_size_kb": 64, 00:13:58.626 "state": "configuring", 00:13:58.626 "raid_level": "raid0", 00:13:58.626 "superblock": false, 00:13:58.626 "num_base_bdevs": 4, 00:13:58.626 "num_base_bdevs_discovered": 3, 00:13:58.626 "num_base_bdevs_operational": 4, 00:13:58.626 "base_bdevs_list": [ 00:13:58.626 { 00:13:58.626 "name": "BaseBdev1", 00:13:58.626 "uuid": "3ed897e3-98c0-44e0-a326-913e00a14951", 00:13:58.626 "is_configured": true, 00:13:58.626 "data_offset": 0, 00:13:58.626 "data_size": 65536 00:13:58.626 }, 00:13:58.626 { 00:13:58.626 "name": "BaseBdev2", 00:13:58.626 "uuid": "7df1515f-b648-4d10-b1a1-1c99b3c08a4d", 00:13:58.626 "is_configured": true, 00:13:58.626 "data_offset": 0, 00:13:58.626 "data_size": 65536 00:13:58.626 }, 00:13:58.626 { 00:13:58.626 "name": "BaseBdev3", 00:13:58.626 "uuid": "04017058-f0b0-43e5-bcdb-ab765d166926", 00:13:58.626 "is_configured": true, 00:13:58.626 "data_offset": 0, 00:13:58.626 "data_size": 65536 00:13:58.626 }, 00:13:58.626 { 00:13:58.626 "name": "BaseBdev4", 00:13:58.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:58.626 "is_configured": false, 00:13:58.626 "data_offset": 0, 00:13:58.626 "data_size": 0 00:13:58.626 } 00:13:58.626 ] 00:13:58.626 }' 00:13:58.626 04:14:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:13:58.626 04:14:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:59.191 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:13:59.449 [2024-05-15 04:14:47.383694] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:13:59.449 [2024-05-15 04:14:47.383753] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x17d97f0 00:13:59.449 [2024-05-15 04:14:47.383764] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:13:59.449 [2024-05-15 04:14:47.383986] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x198d5f0 00:13:59.449 [2024-05-15 04:14:47.384157] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17d97f0 00:13:59.449 [2024-05-15 04:14:47.384173] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x17d97f0 00:13:59.449 [2024-05-15 04:14:47.384399] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:59.449 BaseBdev4 00:13:59.449 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:13:59.449 04:14:47 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:13:59.449 04:14:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:13:59.449 04:14:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:13:59.449 04:14:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:13:59.449 04:14:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:13:59.449 04:14:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:59.718 04:14:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:13:59.978 [ 00:13:59.978 { 00:13:59.978 "name": "BaseBdev4", 00:13:59.978 "aliases": [ 00:13:59.978 "5f60669a-eb06-4966-bf54-08a4842cf64f" 00:13:59.978 ], 00:13:59.978 "product_name": "Malloc disk", 00:13:59.978 "block_size": 512, 00:13:59.978 "num_blocks": 65536, 00:13:59.978 "uuid": "5f60669a-eb06-4966-bf54-08a4842cf64f", 00:13:59.978 "assigned_rate_limits": { 00:13:59.978 "rw_ios_per_sec": 0, 00:13:59.978 "rw_mbytes_per_sec": 0, 00:13:59.978 "r_mbytes_per_sec": 0, 00:13:59.978 "w_mbytes_per_sec": 0 00:13:59.978 }, 00:13:59.978 "claimed": true, 00:13:59.978 "claim_type": "exclusive_write", 00:13:59.978 "zoned": false, 00:13:59.978 "supported_io_types": { 00:13:59.978 "read": true, 00:13:59.978 "write": true, 00:13:59.978 "unmap": true, 00:13:59.978 "write_zeroes": true, 00:13:59.978 "flush": true, 00:13:59.978 "reset": true, 00:13:59.978 "compare": false, 00:13:59.978 "compare_and_write": false, 00:13:59.978 "abort": true, 00:13:59.978 "nvme_admin": false, 00:13:59.978 "nvme_io": false 00:13:59.978 }, 00:13:59.978 "memory_domains": [ 00:13:59.978 { 00:13:59.978 "dma_device_id": "system", 00:13:59.978 "dma_device_type": 1 00:13:59.978 }, 00:13:59.978 { 00:13:59.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.978 "dma_device_type": 2 00:13:59.978 } 00:13:59.978 ], 00:13:59.978 "driver_specific": {} 00:13:59.978 } 00:13:59.978 ] 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.978 04:14:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.236 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:00.236 "name": "Existed_Raid", 00:14:00.236 "uuid": "a9968478-393b-4396-848e-4e11c50966ad", 00:14:00.236 "strip_size_kb": 64, 00:14:00.236 "state": "online", 00:14:00.236 "raid_level": "raid0", 00:14:00.236 "superblock": false, 00:14:00.236 "num_base_bdevs": 4, 00:14:00.236 "num_base_bdevs_discovered": 4, 00:14:00.236 "num_base_bdevs_operational": 4, 00:14:00.236 "base_bdevs_list": [ 00:14:00.236 { 00:14:00.236 "name": "BaseBdev1", 00:14:00.236 "uuid": "3ed897e3-98c0-44e0-a326-913e00a14951", 00:14:00.236 "is_configured": true, 00:14:00.236 "data_offset": 0, 00:14:00.236 "data_size": 65536 00:14:00.236 }, 00:14:00.236 { 00:14:00.236 "name": "BaseBdev2", 00:14:00.236 "uuid": "7df1515f-b648-4d10-b1a1-1c99b3c08a4d", 00:14:00.236 "is_configured": true, 00:14:00.236 "data_offset": 0, 00:14:00.236 "data_size": 65536 00:14:00.236 }, 00:14:00.236 { 00:14:00.236 "name": "BaseBdev3", 00:14:00.236 "uuid": "04017058-f0b0-43e5-bcdb-ab765d166926", 00:14:00.236 "is_configured": true, 00:14:00.236 "data_offset": 0, 00:14:00.236 "data_size": 65536 00:14:00.236 }, 00:14:00.236 { 00:14:00.236 "name": "BaseBdev4", 00:14:00.236 "uuid": "5f60669a-eb06-4966-bf54-08a4842cf64f", 00:14:00.236 "is_configured": true, 00:14:00.236 "data_offset": 0, 00:14:00.236 "data_size": 65536 00:14:00.236 } 00:14:00.236 ] 00:14:00.236 }' 00:14:00.236 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:00.236 04:14:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:00.800 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:14:00.800 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:14:00.800 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:00.800 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:00.800 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:00.800 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:14:00.800 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:00.800 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:01.057 [2024-05-15 04:14:48.879968] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:01.057 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:01.057 "name": "Existed_Raid", 00:14:01.057 "aliases": [ 00:14:01.057 "a9968478-393b-4396-848e-4e11c50966ad" 00:14:01.057 ], 00:14:01.057 
"product_name": "Raid Volume", 00:14:01.057 "block_size": 512, 00:14:01.057 "num_blocks": 262144, 00:14:01.057 "uuid": "a9968478-393b-4396-848e-4e11c50966ad", 00:14:01.057 "assigned_rate_limits": { 00:14:01.058 "rw_ios_per_sec": 0, 00:14:01.058 "rw_mbytes_per_sec": 0, 00:14:01.058 "r_mbytes_per_sec": 0, 00:14:01.058 "w_mbytes_per_sec": 0 00:14:01.058 }, 00:14:01.058 "claimed": false, 00:14:01.058 "zoned": false, 00:14:01.058 "supported_io_types": { 00:14:01.058 "read": true, 00:14:01.058 "write": true, 00:14:01.058 "unmap": true, 00:14:01.058 "write_zeroes": true, 00:14:01.058 "flush": true, 00:14:01.058 "reset": true, 00:14:01.058 "compare": false, 00:14:01.058 "compare_and_write": false, 00:14:01.058 "abort": false, 00:14:01.058 "nvme_admin": false, 00:14:01.058 "nvme_io": false 00:14:01.058 }, 00:14:01.058 "memory_domains": [ 00:14:01.058 { 00:14:01.058 "dma_device_id": "system", 00:14:01.058 "dma_device_type": 1 00:14:01.058 }, 00:14:01.058 { 00:14:01.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.058 "dma_device_type": 2 00:14:01.058 }, 00:14:01.058 { 00:14:01.058 "dma_device_id": "system", 00:14:01.058 "dma_device_type": 1 00:14:01.058 }, 00:14:01.058 { 00:14:01.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.058 "dma_device_type": 2 00:14:01.058 }, 00:14:01.058 { 00:14:01.058 "dma_device_id": "system", 00:14:01.058 "dma_device_type": 1 00:14:01.058 }, 00:14:01.058 { 00:14:01.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.058 "dma_device_type": 2 00:14:01.058 }, 00:14:01.058 { 00:14:01.058 "dma_device_id": "system", 00:14:01.058 "dma_device_type": 1 00:14:01.058 }, 00:14:01.058 { 00:14:01.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.058 "dma_device_type": 2 00:14:01.058 } 00:14:01.058 ], 00:14:01.058 "driver_specific": { 00:14:01.058 "raid": { 00:14:01.058 "uuid": "a9968478-393b-4396-848e-4e11c50966ad", 00:14:01.058 "strip_size_kb": 64, 00:14:01.058 "state": "online", 00:14:01.058 "raid_level": "raid0", 00:14:01.058 "superblock": false, 00:14:01.058 "num_base_bdevs": 4, 00:14:01.058 "num_base_bdevs_discovered": 4, 00:14:01.058 "num_base_bdevs_operational": 4, 00:14:01.058 "base_bdevs_list": [ 00:14:01.058 { 00:14:01.058 "name": "BaseBdev1", 00:14:01.058 "uuid": "3ed897e3-98c0-44e0-a326-913e00a14951", 00:14:01.058 "is_configured": true, 00:14:01.058 "data_offset": 0, 00:14:01.058 "data_size": 65536 00:14:01.058 }, 00:14:01.058 { 00:14:01.058 "name": "BaseBdev2", 00:14:01.058 "uuid": "7df1515f-b648-4d10-b1a1-1c99b3c08a4d", 00:14:01.058 "is_configured": true, 00:14:01.058 "data_offset": 0, 00:14:01.058 "data_size": 65536 00:14:01.058 }, 00:14:01.058 { 00:14:01.058 "name": "BaseBdev3", 00:14:01.058 "uuid": "04017058-f0b0-43e5-bcdb-ab765d166926", 00:14:01.058 "is_configured": true, 00:14:01.058 "data_offset": 0, 00:14:01.058 "data_size": 65536 00:14:01.058 }, 00:14:01.058 { 00:14:01.058 "name": "BaseBdev4", 00:14:01.058 "uuid": "5f60669a-eb06-4966-bf54-08a4842cf64f", 00:14:01.058 "is_configured": true, 00:14:01.058 "data_offset": 0, 00:14:01.058 "data_size": 65536 00:14:01.058 } 00:14:01.058 ] 00:14:01.058 } 00:14:01.058 } 00:14:01.058 }' 00:14:01.058 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:01.058 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:14:01.058 BaseBdev2 00:14:01.058 BaseBdev3 00:14:01.058 BaseBdev4' 00:14:01.058 04:14:48 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:01.058 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:01.058 04:14:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:01.314 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:01.314 "name": "BaseBdev1", 00:14:01.314 "aliases": [ 00:14:01.314 "3ed897e3-98c0-44e0-a326-913e00a14951" 00:14:01.314 ], 00:14:01.314 "product_name": "Malloc disk", 00:14:01.314 "block_size": 512, 00:14:01.314 "num_blocks": 65536, 00:14:01.314 "uuid": "3ed897e3-98c0-44e0-a326-913e00a14951", 00:14:01.314 "assigned_rate_limits": { 00:14:01.314 "rw_ios_per_sec": 0, 00:14:01.314 "rw_mbytes_per_sec": 0, 00:14:01.314 "r_mbytes_per_sec": 0, 00:14:01.314 "w_mbytes_per_sec": 0 00:14:01.314 }, 00:14:01.314 "claimed": true, 00:14:01.314 "claim_type": "exclusive_write", 00:14:01.314 "zoned": false, 00:14:01.314 "supported_io_types": { 00:14:01.314 "read": true, 00:14:01.314 "write": true, 00:14:01.314 "unmap": true, 00:14:01.314 "write_zeroes": true, 00:14:01.314 "flush": true, 00:14:01.314 "reset": true, 00:14:01.314 "compare": false, 00:14:01.314 "compare_and_write": false, 00:14:01.314 "abort": true, 00:14:01.314 "nvme_admin": false, 00:14:01.314 "nvme_io": false 00:14:01.314 }, 00:14:01.314 "memory_domains": [ 00:14:01.314 { 00:14:01.315 "dma_device_id": "system", 00:14:01.315 "dma_device_type": 1 00:14:01.315 }, 00:14:01.315 { 00:14:01.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.315 "dma_device_type": 2 00:14:01.315 } 00:14:01.315 ], 00:14:01.315 "driver_specific": {} 00:14:01.315 }' 00:14:01.315 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:01.315 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:01.315 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:01.315 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:01.571 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:01.827 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:01.827 "name": 
"BaseBdev2", 00:14:01.827 "aliases": [ 00:14:01.827 "7df1515f-b648-4d10-b1a1-1c99b3c08a4d" 00:14:01.827 ], 00:14:01.827 "product_name": "Malloc disk", 00:14:01.827 "block_size": 512, 00:14:01.827 "num_blocks": 65536, 00:14:01.827 "uuid": "7df1515f-b648-4d10-b1a1-1c99b3c08a4d", 00:14:01.827 "assigned_rate_limits": { 00:14:01.827 "rw_ios_per_sec": 0, 00:14:01.827 "rw_mbytes_per_sec": 0, 00:14:01.827 "r_mbytes_per_sec": 0, 00:14:01.827 "w_mbytes_per_sec": 0 00:14:01.827 }, 00:14:01.827 "claimed": true, 00:14:01.827 "claim_type": "exclusive_write", 00:14:01.827 "zoned": false, 00:14:01.827 "supported_io_types": { 00:14:01.827 "read": true, 00:14:01.827 "write": true, 00:14:01.827 "unmap": true, 00:14:01.827 "write_zeroes": true, 00:14:01.827 "flush": true, 00:14:01.827 "reset": true, 00:14:01.827 "compare": false, 00:14:01.827 "compare_and_write": false, 00:14:01.828 "abort": true, 00:14:01.828 "nvme_admin": false, 00:14:01.828 "nvme_io": false 00:14:01.828 }, 00:14:01.828 "memory_domains": [ 00:14:01.828 { 00:14:01.828 "dma_device_id": "system", 00:14:01.828 "dma_device_type": 1 00:14:01.828 }, 00:14:01.828 { 00:14:01.828 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.828 "dma_device_type": 2 00:14:01.828 } 00:14:01.828 ], 00:14:01.828 "driver_specific": {} 00:14:01.828 }' 00:14:01.828 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:01.828 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:02.084 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:02.084 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:02.084 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:02.084 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:02.084 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:02.084 04:14:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:02.084 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:02.084 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:02.084 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:02.084 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:02.084 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:02.084 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:02.084 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:02.341 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:02.341 "name": "BaseBdev3", 00:14:02.341 "aliases": [ 00:14:02.341 "04017058-f0b0-43e5-bcdb-ab765d166926" 00:14:02.341 ], 00:14:02.341 "product_name": "Malloc disk", 00:14:02.341 "block_size": 512, 00:14:02.341 "num_blocks": 65536, 00:14:02.341 "uuid": "04017058-f0b0-43e5-bcdb-ab765d166926", 00:14:02.341 "assigned_rate_limits": { 00:14:02.341 "rw_ios_per_sec": 0, 00:14:02.341 "rw_mbytes_per_sec": 0, 00:14:02.341 "r_mbytes_per_sec": 0, 00:14:02.341 "w_mbytes_per_sec": 0 00:14:02.341 }, 
00:14:02.341 "claimed": true, 00:14:02.341 "claim_type": "exclusive_write", 00:14:02.341 "zoned": false, 00:14:02.341 "supported_io_types": { 00:14:02.341 "read": true, 00:14:02.341 "write": true, 00:14:02.341 "unmap": true, 00:14:02.341 "write_zeroes": true, 00:14:02.341 "flush": true, 00:14:02.341 "reset": true, 00:14:02.341 "compare": false, 00:14:02.341 "compare_and_write": false, 00:14:02.341 "abort": true, 00:14:02.341 "nvme_admin": false, 00:14:02.341 "nvme_io": false 00:14:02.341 }, 00:14:02.341 "memory_domains": [ 00:14:02.341 { 00:14:02.341 "dma_device_id": "system", 00:14:02.341 "dma_device_type": 1 00:14:02.341 }, 00:14:02.341 { 00:14:02.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:02.341 "dma_device_type": 2 00:14:02.341 } 00:14:02.341 ], 00:14:02.341 "driver_specific": {} 00:14:02.341 }' 00:14:02.341 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:02.598 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:02.598 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:02.598 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:02.598 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:02.598 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:02.598 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:02.598 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:02.598 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:02.598 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:02.598 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:02.855 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:02.855 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:02.855 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:02.855 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:03.113 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:03.113 "name": "BaseBdev4", 00:14:03.113 "aliases": [ 00:14:03.113 "5f60669a-eb06-4966-bf54-08a4842cf64f" 00:14:03.113 ], 00:14:03.113 "product_name": "Malloc disk", 00:14:03.113 "block_size": 512, 00:14:03.113 "num_blocks": 65536, 00:14:03.113 "uuid": "5f60669a-eb06-4966-bf54-08a4842cf64f", 00:14:03.113 "assigned_rate_limits": { 00:14:03.113 "rw_ios_per_sec": 0, 00:14:03.113 "rw_mbytes_per_sec": 0, 00:14:03.113 "r_mbytes_per_sec": 0, 00:14:03.113 "w_mbytes_per_sec": 0 00:14:03.113 }, 00:14:03.113 "claimed": true, 00:14:03.113 "claim_type": "exclusive_write", 00:14:03.113 "zoned": false, 00:14:03.113 "supported_io_types": { 00:14:03.113 "read": true, 00:14:03.113 "write": true, 00:14:03.113 "unmap": true, 00:14:03.113 "write_zeroes": true, 00:14:03.113 "flush": true, 00:14:03.113 "reset": true, 00:14:03.113 "compare": false, 00:14:03.113 "compare_and_write": false, 00:14:03.113 "abort": true, 00:14:03.113 "nvme_admin": false, 00:14:03.113 "nvme_io": false 
00:14:03.113 }, 00:14:03.113 "memory_domains": [ 00:14:03.113 { 00:14:03.113 "dma_device_id": "system", 00:14:03.113 "dma_device_type": 1 00:14:03.113 }, 00:14:03.113 { 00:14:03.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.113 "dma_device_type": 2 00:14:03.113 } 00:14:03.113 ], 00:14:03.113 "driver_specific": {} 00:14:03.113 }' 00:14:03.113 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:03.113 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:03.113 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:03.113 04:14:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:03.113 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:03.113 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:03.113 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:03.113 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:03.113 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:03.113 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:03.370 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:03.370 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:03.370 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:03.629 [2024-05-15 04:14:51.414518] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:03.629 [2024-05-15 04:14:51.414551] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:03.629 [2024-05-15 04:14:51.414609] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.629 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.886 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:03.886 "name": "Existed_Raid", 00:14:03.886 "uuid": "a9968478-393b-4396-848e-4e11c50966ad", 00:14:03.886 "strip_size_kb": 64, 00:14:03.886 "state": "offline", 00:14:03.886 "raid_level": "raid0", 00:14:03.886 "superblock": false, 00:14:03.886 "num_base_bdevs": 4, 00:14:03.886 "num_base_bdevs_discovered": 3, 00:14:03.886 "num_base_bdevs_operational": 3, 00:14:03.886 "base_bdevs_list": [ 00:14:03.886 { 00:14:03.886 "name": null, 00:14:03.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:03.886 "is_configured": false, 00:14:03.886 "data_offset": 0, 00:14:03.886 "data_size": 65536 00:14:03.886 }, 00:14:03.886 { 00:14:03.886 "name": "BaseBdev2", 00:14:03.886 "uuid": "7df1515f-b648-4d10-b1a1-1c99b3c08a4d", 00:14:03.886 "is_configured": true, 00:14:03.886 "data_offset": 0, 00:14:03.886 "data_size": 65536 00:14:03.886 }, 00:14:03.886 { 00:14:03.886 "name": "BaseBdev3", 00:14:03.886 "uuid": "04017058-f0b0-43e5-bcdb-ab765d166926", 00:14:03.886 "is_configured": true, 00:14:03.886 "data_offset": 0, 00:14:03.886 "data_size": 65536 00:14:03.886 }, 00:14:03.887 { 00:14:03.887 "name": "BaseBdev4", 00:14:03.887 "uuid": "5f60669a-eb06-4966-bf54-08a4842cf64f", 00:14:03.887 "is_configured": true, 00:14:03.887 "data_offset": 0, 00:14:03.887 "data_size": 65536 00:14:03.887 } 00:14:03.887 ] 00:14:03.887 }' 00:14:03.887 04:14:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:03.887 04:14:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:04.451 04:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:14:04.451 04:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:04.451 04:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.451 04:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:04.708 04:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:04.708 04:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:04.708 04:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:04.966 [2024-05-15 04:14:52.780280] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:04.966 04:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:04.966 04:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:04.966 04:14:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.966 04:14:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:05.223 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:05.223 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:05.223 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:05.480 [2024-05-15 04:14:53.322170] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:05.480 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:05.480 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:05.480 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.480 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:05.736 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:05.736 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:05.736 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:14:05.994 [2024-05-15 04:14:53.901429] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:14:05.994 [2024-05-15 04:14:53.901488] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17d97f0 name Existed_Raid, state offline 00:14:05.994 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:05.994 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:05.994 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.994 04:14:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:14:06.251 04:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:14:06.251 04:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:14:06.251 04:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:14:06.251 04:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:14:06.251 04:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:06.251 04:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:06.509 BaseBdev2 00:14:06.509 04:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:14:06.509 04:14:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local 
bdev_name=BaseBdev2 00:14:06.509 04:14:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:06.509 04:14:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:06.509 04:14:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:06.509 04:14:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:06.509 04:14:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:06.766 04:14:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:07.023 [ 00:14:07.023 { 00:14:07.023 "name": "BaseBdev2", 00:14:07.023 "aliases": [ 00:14:07.023 "014d67ea-f6ed-4fb9-9278-57210753401d" 00:14:07.023 ], 00:14:07.023 "product_name": "Malloc disk", 00:14:07.023 "block_size": 512, 00:14:07.023 "num_blocks": 65536, 00:14:07.023 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:07.023 "assigned_rate_limits": { 00:14:07.023 "rw_ios_per_sec": 0, 00:14:07.023 "rw_mbytes_per_sec": 0, 00:14:07.023 "r_mbytes_per_sec": 0, 00:14:07.023 "w_mbytes_per_sec": 0 00:14:07.023 }, 00:14:07.023 "claimed": false, 00:14:07.023 "zoned": false, 00:14:07.023 "supported_io_types": { 00:14:07.023 "read": true, 00:14:07.023 "write": true, 00:14:07.023 "unmap": true, 00:14:07.023 "write_zeroes": true, 00:14:07.023 "flush": true, 00:14:07.023 "reset": true, 00:14:07.023 "compare": false, 00:14:07.023 "compare_and_write": false, 00:14:07.023 "abort": true, 00:14:07.023 "nvme_admin": false, 00:14:07.023 "nvme_io": false 00:14:07.023 }, 00:14:07.023 "memory_domains": [ 00:14:07.023 { 00:14:07.023 "dma_device_id": "system", 00:14:07.023 "dma_device_type": 1 00:14:07.023 }, 00:14:07.023 { 00:14:07.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.023 "dma_device_type": 2 00:14:07.023 } 00:14:07.023 ], 00:14:07.023 "driver_specific": {} 00:14:07.023 } 00:14:07.023 ] 00:14:07.023 04:14:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:07.023 04:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:07.023 04:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:07.024 04:14:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:07.281 BaseBdev3 00:14:07.281 04:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:14:07.281 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:14:07.281 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:07.281 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:07.281 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:07.281 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:07.281 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:07.538 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:07.796 [ 00:14:07.796 { 00:14:07.796 "name": "BaseBdev3", 00:14:07.796 "aliases": [ 00:14:07.796 "06dc4ca3-5357-4d72-b83a-970a65e7bd0c" 00:14:07.796 ], 00:14:07.796 "product_name": "Malloc disk", 00:14:07.796 "block_size": 512, 00:14:07.796 "num_blocks": 65536, 00:14:07.796 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:07.796 "assigned_rate_limits": { 00:14:07.796 "rw_ios_per_sec": 0, 00:14:07.796 "rw_mbytes_per_sec": 0, 00:14:07.796 "r_mbytes_per_sec": 0, 00:14:07.796 "w_mbytes_per_sec": 0 00:14:07.796 }, 00:14:07.796 "claimed": false, 00:14:07.796 "zoned": false, 00:14:07.796 "supported_io_types": { 00:14:07.796 "read": true, 00:14:07.796 "write": true, 00:14:07.796 "unmap": true, 00:14:07.796 "write_zeroes": true, 00:14:07.796 "flush": true, 00:14:07.796 "reset": true, 00:14:07.796 "compare": false, 00:14:07.796 "compare_and_write": false, 00:14:07.796 "abort": true, 00:14:07.796 "nvme_admin": false, 00:14:07.796 "nvme_io": false 00:14:07.796 }, 00:14:07.796 "memory_domains": [ 00:14:07.796 { 00:14:07.796 "dma_device_id": "system", 00:14:07.796 "dma_device_type": 1 00:14:07.796 }, 00:14:07.796 { 00:14:07.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.796 "dma_device_type": 2 00:14:07.796 } 00:14:07.796 ], 00:14:07.796 "driver_specific": {} 00:14:07.796 } 00:14:07.796 ] 00:14:07.796 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:07.796 04:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:07.796 04:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:07.796 04:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:08.053 BaseBdev4 00:14:08.053 04:14:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:14:08.053 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:14:08.053 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:08.053 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:08.053 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:08.053 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:08.053 04:14:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:08.309 04:14:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:08.567 [ 00:14:08.567 { 00:14:08.567 "name": "BaseBdev4", 00:14:08.567 "aliases": [ 00:14:08.567 "e86eb3de-101f-45b6-a852-9fc9720d563a" 00:14:08.567 ], 00:14:08.567 "product_name": "Malloc disk", 00:14:08.567 "block_size": 512, 00:14:08.567 
"num_blocks": 65536, 00:14:08.567 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:08.567 "assigned_rate_limits": { 00:14:08.567 "rw_ios_per_sec": 0, 00:14:08.567 "rw_mbytes_per_sec": 0, 00:14:08.567 "r_mbytes_per_sec": 0, 00:14:08.567 "w_mbytes_per_sec": 0 00:14:08.567 }, 00:14:08.567 "claimed": false, 00:14:08.567 "zoned": false, 00:14:08.567 "supported_io_types": { 00:14:08.567 "read": true, 00:14:08.567 "write": true, 00:14:08.567 "unmap": true, 00:14:08.567 "write_zeroes": true, 00:14:08.567 "flush": true, 00:14:08.567 "reset": true, 00:14:08.567 "compare": false, 00:14:08.567 "compare_and_write": false, 00:14:08.567 "abort": true, 00:14:08.567 "nvme_admin": false, 00:14:08.567 "nvme_io": false 00:14:08.567 }, 00:14:08.567 "memory_domains": [ 00:14:08.567 { 00:14:08.567 "dma_device_id": "system", 00:14:08.567 "dma_device_type": 1 00:14:08.567 }, 00:14:08.567 { 00:14:08.567 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.567 "dma_device_type": 2 00:14:08.567 } 00:14:08.567 ], 00:14:08.567 "driver_specific": {} 00:14:08.567 } 00:14:08.567 ] 00:14:08.567 04:14:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:08.567 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:08.567 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:08.567 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:08.825 [2024-05-15 04:14:56.744439] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:08.825 [2024-05-15 04:14:56.744488] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:08.825 [2024-05-15 04:14:56.744516] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:08.825 [2024-05-15 04:14:56.746022] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:08.825 [2024-05-15 04:14:56.746075] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.825 04:14:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:09.084 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:09.084 "name": "Existed_Raid", 00:14:09.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.084 "strip_size_kb": 64, 00:14:09.084 "state": "configuring", 00:14:09.084 "raid_level": "raid0", 00:14:09.084 "superblock": false, 00:14:09.084 "num_base_bdevs": 4, 00:14:09.084 "num_base_bdevs_discovered": 3, 00:14:09.084 "num_base_bdevs_operational": 4, 00:14:09.084 "base_bdevs_list": [ 00:14:09.084 { 00:14:09.084 "name": "BaseBdev1", 00:14:09.084 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.084 "is_configured": false, 00:14:09.084 "data_offset": 0, 00:14:09.084 "data_size": 0 00:14:09.084 }, 00:14:09.084 { 00:14:09.084 "name": "BaseBdev2", 00:14:09.084 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:09.084 "is_configured": true, 00:14:09.084 "data_offset": 0, 00:14:09.084 "data_size": 65536 00:14:09.084 }, 00:14:09.084 { 00:14:09.084 "name": "BaseBdev3", 00:14:09.084 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:09.084 "is_configured": true, 00:14:09.084 "data_offset": 0, 00:14:09.084 "data_size": 65536 00:14:09.084 }, 00:14:09.084 { 00:14:09.084 "name": "BaseBdev4", 00:14:09.084 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:09.084 "is_configured": true, 00:14:09.084 "data_offset": 0, 00:14:09.084 "data_size": 65536 00:14:09.084 } 00:14:09.084 ] 00:14:09.084 }' 00:14:09.084 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:09.084 04:14:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:09.648 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:09.905 [2024-05-15 04:14:57.919517] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:10.162 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:14:10.163 04:14:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.420 04:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:10.420 "name": "Existed_Raid", 00:14:10.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.420 "strip_size_kb": 64, 00:14:10.420 "state": "configuring", 00:14:10.420 "raid_level": "raid0", 00:14:10.420 "superblock": false, 00:14:10.420 "num_base_bdevs": 4, 00:14:10.420 "num_base_bdevs_discovered": 2, 00:14:10.420 "num_base_bdevs_operational": 4, 00:14:10.420 "base_bdevs_list": [ 00:14:10.420 { 00:14:10.420 "name": "BaseBdev1", 00:14:10.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.420 "is_configured": false, 00:14:10.420 "data_offset": 0, 00:14:10.420 "data_size": 0 00:14:10.420 }, 00:14:10.420 { 00:14:10.420 "name": null, 00:14:10.420 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:10.420 "is_configured": false, 00:14:10.420 "data_offset": 0, 00:14:10.420 "data_size": 65536 00:14:10.420 }, 00:14:10.420 { 00:14:10.420 "name": "BaseBdev3", 00:14:10.420 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:10.420 "is_configured": true, 00:14:10.420 "data_offset": 0, 00:14:10.420 "data_size": 65536 00:14:10.420 }, 00:14:10.420 { 00:14:10.420 "name": "BaseBdev4", 00:14:10.420 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:10.420 "is_configured": true, 00:14:10.420 "data_offset": 0, 00:14:10.420 "data_size": 65536 00:14:10.420 } 00:14:10.420 ] 00:14:10.420 }' 00:14:10.420 04:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:10.420 04:14:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.985 04:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.985 04:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:10.985 04:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:14:10.985 04:14:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:11.242 [2024-05-15 04:14:59.245427] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:11.242 BaseBdev1 00:14:11.499 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:14:11.499 04:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:14:11.499 04:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:11.499 04:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:11.499 04:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:11.499 04:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:11.499 04:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:11.757 04:14:59 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:12.014 [ 00:14:12.014 { 00:14:12.014 "name": "BaseBdev1", 00:14:12.014 "aliases": [ 00:14:12.014 "9805b924-8420-48bc-a2d7-8cbebf54550c" 00:14:12.014 ], 00:14:12.014 "product_name": "Malloc disk", 00:14:12.014 "block_size": 512, 00:14:12.014 "num_blocks": 65536, 00:14:12.014 "uuid": "9805b924-8420-48bc-a2d7-8cbebf54550c", 00:14:12.014 "assigned_rate_limits": { 00:14:12.014 "rw_ios_per_sec": 0, 00:14:12.014 "rw_mbytes_per_sec": 0, 00:14:12.014 "r_mbytes_per_sec": 0, 00:14:12.014 "w_mbytes_per_sec": 0 00:14:12.014 }, 00:14:12.014 "claimed": true, 00:14:12.014 "claim_type": "exclusive_write", 00:14:12.014 "zoned": false, 00:14:12.014 "supported_io_types": { 00:14:12.014 "read": true, 00:14:12.014 "write": true, 00:14:12.014 "unmap": true, 00:14:12.014 "write_zeroes": true, 00:14:12.014 "flush": true, 00:14:12.014 "reset": true, 00:14:12.014 "compare": false, 00:14:12.014 "compare_and_write": false, 00:14:12.014 "abort": true, 00:14:12.014 "nvme_admin": false, 00:14:12.014 "nvme_io": false 00:14:12.014 }, 00:14:12.014 "memory_domains": [ 00:14:12.014 { 00:14:12.015 "dma_device_id": "system", 00:14:12.015 "dma_device_type": 1 00:14:12.015 }, 00:14:12.015 { 00:14:12.015 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.015 "dma_device_type": 2 00:14:12.015 } 00:14:12.015 ], 00:14:12.015 "driver_specific": {} 00:14:12.015 } 00:14:12.015 ] 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.015 04:14:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:12.273 04:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:12.273 "name": "Existed_Raid", 00:14:12.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:12.273 "strip_size_kb": 64, 00:14:12.273 "state": "configuring", 00:14:12.273 "raid_level": "raid0", 00:14:12.273 "superblock": false, 00:14:12.273 "num_base_bdevs": 4, 00:14:12.273 "num_base_bdevs_discovered": 3, 00:14:12.273 
"num_base_bdevs_operational": 4, 00:14:12.273 "base_bdevs_list": [ 00:14:12.273 { 00:14:12.273 "name": "BaseBdev1", 00:14:12.273 "uuid": "9805b924-8420-48bc-a2d7-8cbebf54550c", 00:14:12.273 "is_configured": true, 00:14:12.273 "data_offset": 0, 00:14:12.273 "data_size": 65536 00:14:12.273 }, 00:14:12.273 { 00:14:12.273 "name": null, 00:14:12.273 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:12.273 "is_configured": false, 00:14:12.273 "data_offset": 0, 00:14:12.273 "data_size": 65536 00:14:12.273 }, 00:14:12.273 { 00:14:12.273 "name": "BaseBdev3", 00:14:12.273 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:12.273 "is_configured": true, 00:14:12.273 "data_offset": 0, 00:14:12.273 "data_size": 65536 00:14:12.273 }, 00:14:12.273 { 00:14:12.273 "name": "BaseBdev4", 00:14:12.273 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:12.273 "is_configured": true, 00:14:12.273 "data_offset": 0, 00:14:12.273 "data_size": 65536 00:14:12.273 } 00:14:12.273 ] 00:14:12.273 }' 00:14:12.273 04:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:12.273 04:15:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.886 04:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.886 04:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:13.147 04:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:14:13.147 04:15:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:13.409 [2024-05-15 04:15:01.182583] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.409 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:13.666 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 
00:14:13.666 "name": "Existed_Raid", 00:14:13.666 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.666 "strip_size_kb": 64, 00:14:13.666 "state": "configuring", 00:14:13.666 "raid_level": "raid0", 00:14:13.666 "superblock": false, 00:14:13.666 "num_base_bdevs": 4, 00:14:13.666 "num_base_bdevs_discovered": 2, 00:14:13.666 "num_base_bdevs_operational": 4, 00:14:13.666 "base_bdevs_list": [ 00:14:13.666 { 00:14:13.666 "name": "BaseBdev1", 00:14:13.666 "uuid": "9805b924-8420-48bc-a2d7-8cbebf54550c", 00:14:13.666 "is_configured": true, 00:14:13.666 "data_offset": 0, 00:14:13.666 "data_size": 65536 00:14:13.666 }, 00:14:13.666 { 00:14:13.666 "name": null, 00:14:13.666 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:13.666 "is_configured": false, 00:14:13.666 "data_offset": 0, 00:14:13.666 "data_size": 65536 00:14:13.666 }, 00:14:13.666 { 00:14:13.666 "name": null, 00:14:13.666 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:13.666 "is_configured": false, 00:14:13.666 "data_offset": 0, 00:14:13.666 "data_size": 65536 00:14:13.666 }, 00:14:13.666 { 00:14:13.666 "name": "BaseBdev4", 00:14:13.666 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:13.666 "is_configured": true, 00:14:13.666 "data_offset": 0, 00:14:13.666 "data_size": 65536 00:14:13.666 } 00:14:13.666 ] 00:14:13.666 }' 00:14:13.666 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:13.666 04:15:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:14.230 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.230 04:15:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:14.230 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:14:14.230 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:14.508 [2024-05-15 04:15:02.474044] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.508 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.071 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:15.071 "name": "Existed_Raid", 00:14:15.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.071 "strip_size_kb": 64, 00:14:15.071 "state": "configuring", 00:14:15.071 "raid_level": "raid0", 00:14:15.071 "superblock": false, 00:14:15.071 "num_base_bdevs": 4, 00:14:15.071 "num_base_bdevs_discovered": 3, 00:14:15.071 "num_base_bdevs_operational": 4, 00:14:15.071 "base_bdevs_list": [ 00:14:15.071 { 00:14:15.071 "name": "BaseBdev1", 00:14:15.071 "uuid": "9805b924-8420-48bc-a2d7-8cbebf54550c", 00:14:15.071 "is_configured": true, 00:14:15.071 "data_offset": 0, 00:14:15.071 "data_size": 65536 00:14:15.071 }, 00:14:15.071 { 00:14:15.071 "name": null, 00:14:15.071 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:15.071 "is_configured": false, 00:14:15.071 "data_offset": 0, 00:14:15.071 "data_size": 65536 00:14:15.071 }, 00:14:15.071 { 00:14:15.071 "name": "BaseBdev3", 00:14:15.071 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:15.071 "is_configured": true, 00:14:15.071 "data_offset": 0, 00:14:15.071 "data_size": 65536 00:14:15.071 }, 00:14:15.071 { 00:14:15.071 "name": "BaseBdev4", 00:14:15.071 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:15.071 "is_configured": true, 00:14:15.071 "data_offset": 0, 00:14:15.071 "data_size": 65536 00:14:15.071 } 00:14:15.071 ] 00:14:15.071 }' 00:14:15.071 04:15:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:15.071 04:15:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:15.640 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.640 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:15.640 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:14:15.640 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:15.898 [2024-05-15 04:15:03.901894] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.155 04:15:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:16.413 04:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:16.413 "name": "Existed_Raid", 00:14:16.413 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:16.413 "strip_size_kb": 64, 00:14:16.413 "state": "configuring", 00:14:16.413 "raid_level": "raid0", 00:14:16.413 "superblock": false, 00:14:16.413 "num_base_bdevs": 4, 00:14:16.413 "num_base_bdevs_discovered": 2, 00:14:16.413 "num_base_bdevs_operational": 4, 00:14:16.413 "base_bdevs_list": [ 00:14:16.413 { 00:14:16.413 "name": null, 00:14:16.413 "uuid": "9805b924-8420-48bc-a2d7-8cbebf54550c", 00:14:16.413 "is_configured": false, 00:14:16.413 "data_offset": 0, 00:14:16.413 "data_size": 65536 00:14:16.413 }, 00:14:16.413 { 00:14:16.413 "name": null, 00:14:16.413 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:16.413 "is_configured": false, 00:14:16.413 "data_offset": 0, 00:14:16.413 "data_size": 65536 00:14:16.413 }, 00:14:16.413 { 00:14:16.413 "name": "BaseBdev3", 00:14:16.413 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:16.413 "is_configured": true, 00:14:16.413 "data_offset": 0, 00:14:16.413 "data_size": 65536 00:14:16.413 }, 00:14:16.413 { 00:14:16.413 "name": "BaseBdev4", 00:14:16.413 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:16.413 "is_configured": true, 00:14:16.413 "data_offset": 0, 00:14:16.413 "data_size": 65536 00:14:16.413 } 00:14:16.413 ] 00:14:16.413 }' 00:14:16.413 04:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:16.413 04:15:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:16.978 04:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.978 04:15:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:17.235 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:14:17.235 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:17.493 [2024-05-15 04:15:05.280090] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.493 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:17.751 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:17.751 "name": "Existed_Raid", 00:14:17.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:17.751 "strip_size_kb": 64, 00:14:17.751 "state": "configuring", 00:14:17.751 "raid_level": "raid0", 00:14:17.751 "superblock": false, 00:14:17.751 "num_base_bdevs": 4, 00:14:17.751 "num_base_bdevs_discovered": 3, 00:14:17.751 "num_base_bdevs_operational": 4, 00:14:17.751 "base_bdevs_list": [ 00:14:17.751 { 00:14:17.751 "name": null, 00:14:17.751 "uuid": "9805b924-8420-48bc-a2d7-8cbebf54550c", 00:14:17.751 "is_configured": false, 00:14:17.751 "data_offset": 0, 00:14:17.751 "data_size": 65536 00:14:17.751 }, 00:14:17.751 { 00:14:17.751 "name": "BaseBdev2", 00:14:17.751 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:17.751 "is_configured": true, 00:14:17.751 "data_offset": 0, 00:14:17.751 "data_size": 65536 00:14:17.751 }, 00:14:17.751 { 00:14:17.751 "name": "BaseBdev3", 00:14:17.751 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:17.751 "is_configured": true, 00:14:17.751 "data_offset": 0, 00:14:17.751 "data_size": 65536 00:14:17.751 }, 00:14:17.751 { 00:14:17.751 "name": "BaseBdev4", 00:14:17.751 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:17.751 "is_configured": true, 00:14:17.751 "data_offset": 0, 00:14:17.751 "data_size": 65536 00:14:17.751 } 00:14:17.751 ] 00:14:17.751 }' 00:14:17.751 04:15:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:17.751 04:15:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.316 04:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.316 04:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:18.573 04:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:14:18.573 04:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.573 04:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:18.831 04:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 
9805b924-8420-48bc-a2d7-8cbebf54550c 00:14:18.831 [2024-05-15 04:15:06.830497] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:18.831 [2024-05-15 04:15:06.830557] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x197f340 00:14:18.831 [2024-05-15 04:15:06.830568] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:14:18.831 [2024-05-15 04:15:06.830780] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1981150 00:14:18.831 [2024-05-15 04:15:06.830958] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x197f340 00:14:18.831 [2024-05-15 04:15:06.830975] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x197f340 00:14:18.831 [2024-05-15 04:15:06.831201] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:18.831 NewBaseBdev 00:14:18.831 04:15:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:14:18.831 04:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:14:18.831 04:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:18.831 04:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:14:18.831 04:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:18.831 04:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:18.831 04:15:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:19.397 [ 00:14:19.397 { 00:14:19.397 "name": "NewBaseBdev", 00:14:19.397 "aliases": [ 00:14:19.397 "9805b924-8420-48bc-a2d7-8cbebf54550c" 00:14:19.397 ], 00:14:19.397 "product_name": "Malloc disk", 00:14:19.397 "block_size": 512, 00:14:19.397 "num_blocks": 65536, 00:14:19.397 "uuid": "9805b924-8420-48bc-a2d7-8cbebf54550c", 00:14:19.397 "assigned_rate_limits": { 00:14:19.397 "rw_ios_per_sec": 0, 00:14:19.397 "rw_mbytes_per_sec": 0, 00:14:19.397 "r_mbytes_per_sec": 0, 00:14:19.397 "w_mbytes_per_sec": 0 00:14:19.397 }, 00:14:19.397 "claimed": true, 00:14:19.397 "claim_type": "exclusive_write", 00:14:19.397 "zoned": false, 00:14:19.397 "supported_io_types": { 00:14:19.397 "read": true, 00:14:19.397 "write": true, 00:14:19.397 "unmap": true, 00:14:19.397 "write_zeroes": true, 00:14:19.397 "flush": true, 00:14:19.397 "reset": true, 00:14:19.397 "compare": false, 00:14:19.397 "compare_and_write": false, 00:14:19.397 "abort": true, 00:14:19.397 "nvme_admin": false, 00:14:19.397 "nvme_io": false 00:14:19.397 }, 00:14:19.397 "memory_domains": [ 00:14:19.397 { 00:14:19.397 "dma_device_id": "system", 00:14:19.397 "dma_device_type": 1 00:14:19.397 }, 00:14:19.397 { 00:14:19.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.397 "dma_device_type": 2 00:14:19.397 } 00:14:19.397 ], 00:14:19.397 "driver_specific": {} 00:14:19.397 } 00:14:19.397 ] 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:14:19.397 04:15:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:19.397 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:19.962 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:19.963 "name": "Existed_Raid", 00:14:19.963 "uuid": "937be335-ae0c-4672-8032-4cc71ba29e95", 00:14:19.963 "strip_size_kb": 64, 00:14:19.963 "state": "online", 00:14:19.963 "raid_level": "raid0", 00:14:19.963 "superblock": false, 00:14:19.963 "num_base_bdevs": 4, 00:14:19.963 "num_base_bdevs_discovered": 4, 00:14:19.963 "num_base_bdevs_operational": 4, 00:14:19.963 "base_bdevs_list": [ 00:14:19.963 { 00:14:19.963 "name": "NewBaseBdev", 00:14:19.963 "uuid": "9805b924-8420-48bc-a2d7-8cbebf54550c", 00:14:19.963 "is_configured": true, 00:14:19.963 "data_offset": 0, 00:14:19.963 "data_size": 65536 00:14:19.963 }, 00:14:19.963 { 00:14:19.963 "name": "BaseBdev2", 00:14:19.963 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:19.963 "is_configured": true, 00:14:19.963 "data_offset": 0, 00:14:19.963 "data_size": 65536 00:14:19.963 }, 00:14:19.963 { 00:14:19.963 "name": "BaseBdev3", 00:14:19.963 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:19.963 "is_configured": true, 00:14:19.963 "data_offset": 0, 00:14:19.963 "data_size": 65536 00:14:19.963 }, 00:14:19.963 { 00:14:19.963 "name": "BaseBdev4", 00:14:19.963 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:19.963 "is_configured": true, 00:14:19.963 "data_offset": 0, 00:14:19.963 "data_size": 65536 00:14:19.963 } 00:14:19.963 ] 00:14:19.963 }' 00:14:19.963 04:15:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:19.963 04:15:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:20.220 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:14:20.220 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:14:20.220 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:20.220 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_info 00:14:20.220 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:20.220 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:14:20.220 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:20.220 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:20.478 [2024-05-15 04:15:08.451101] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:20.478 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:20.478 "name": "Existed_Raid", 00:14:20.478 "aliases": [ 00:14:20.478 "937be335-ae0c-4672-8032-4cc71ba29e95" 00:14:20.478 ], 00:14:20.478 "product_name": "Raid Volume", 00:14:20.478 "block_size": 512, 00:14:20.478 "num_blocks": 262144, 00:14:20.478 "uuid": "937be335-ae0c-4672-8032-4cc71ba29e95", 00:14:20.478 "assigned_rate_limits": { 00:14:20.478 "rw_ios_per_sec": 0, 00:14:20.478 "rw_mbytes_per_sec": 0, 00:14:20.478 "r_mbytes_per_sec": 0, 00:14:20.478 "w_mbytes_per_sec": 0 00:14:20.478 }, 00:14:20.478 "claimed": false, 00:14:20.478 "zoned": false, 00:14:20.478 "supported_io_types": { 00:14:20.478 "read": true, 00:14:20.478 "write": true, 00:14:20.478 "unmap": true, 00:14:20.478 "write_zeroes": true, 00:14:20.478 "flush": true, 00:14:20.478 "reset": true, 00:14:20.478 "compare": false, 00:14:20.478 "compare_and_write": false, 00:14:20.478 "abort": false, 00:14:20.478 "nvme_admin": false, 00:14:20.478 "nvme_io": false 00:14:20.478 }, 00:14:20.478 "memory_domains": [ 00:14:20.478 { 00:14:20.478 "dma_device_id": "system", 00:14:20.479 "dma_device_type": 1 00:14:20.479 }, 00:14:20.479 { 00:14:20.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.479 "dma_device_type": 2 00:14:20.479 }, 00:14:20.479 { 00:14:20.479 "dma_device_id": "system", 00:14:20.479 "dma_device_type": 1 00:14:20.479 }, 00:14:20.479 { 00:14:20.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.479 "dma_device_type": 2 00:14:20.479 }, 00:14:20.479 { 00:14:20.479 "dma_device_id": "system", 00:14:20.479 "dma_device_type": 1 00:14:20.479 }, 00:14:20.479 { 00:14:20.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.479 "dma_device_type": 2 00:14:20.479 }, 00:14:20.479 { 00:14:20.479 "dma_device_id": "system", 00:14:20.479 "dma_device_type": 1 00:14:20.479 }, 00:14:20.479 { 00:14:20.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.479 "dma_device_type": 2 00:14:20.479 } 00:14:20.479 ], 00:14:20.479 "driver_specific": { 00:14:20.479 "raid": { 00:14:20.479 "uuid": "937be335-ae0c-4672-8032-4cc71ba29e95", 00:14:20.479 "strip_size_kb": 64, 00:14:20.479 "state": "online", 00:14:20.479 "raid_level": "raid0", 00:14:20.479 "superblock": false, 00:14:20.479 "num_base_bdevs": 4, 00:14:20.479 "num_base_bdevs_discovered": 4, 00:14:20.479 "num_base_bdevs_operational": 4, 00:14:20.479 "base_bdevs_list": [ 00:14:20.479 { 00:14:20.479 "name": "NewBaseBdev", 00:14:20.479 "uuid": "9805b924-8420-48bc-a2d7-8cbebf54550c", 00:14:20.479 "is_configured": true, 00:14:20.479 "data_offset": 0, 00:14:20.479 "data_size": 65536 00:14:20.479 }, 00:14:20.479 { 00:14:20.479 "name": "BaseBdev2", 00:14:20.479 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:20.479 "is_configured": true, 00:14:20.479 "data_offset": 0, 00:14:20.479 "data_size": 65536 00:14:20.479 }, 00:14:20.479 { 00:14:20.479 "name": 
"BaseBdev3", 00:14:20.479 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:20.479 "is_configured": true, 00:14:20.479 "data_offset": 0, 00:14:20.479 "data_size": 65536 00:14:20.479 }, 00:14:20.479 { 00:14:20.479 "name": "BaseBdev4", 00:14:20.479 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:20.479 "is_configured": true, 00:14:20.479 "data_offset": 0, 00:14:20.479 "data_size": 65536 00:14:20.479 } 00:14:20.479 ] 00:14:20.479 } 00:14:20.479 } 00:14:20.479 }' 00:14:20.479 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:20.736 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:14:20.736 BaseBdev2 00:14:20.736 BaseBdev3 00:14:20.736 BaseBdev4' 00:14:20.736 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:20.736 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:20.736 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:20.994 "name": "NewBaseBdev", 00:14:20.994 "aliases": [ 00:14:20.994 "9805b924-8420-48bc-a2d7-8cbebf54550c" 00:14:20.994 ], 00:14:20.994 "product_name": "Malloc disk", 00:14:20.994 "block_size": 512, 00:14:20.994 "num_blocks": 65536, 00:14:20.994 "uuid": "9805b924-8420-48bc-a2d7-8cbebf54550c", 00:14:20.994 "assigned_rate_limits": { 00:14:20.994 "rw_ios_per_sec": 0, 00:14:20.994 "rw_mbytes_per_sec": 0, 00:14:20.994 "r_mbytes_per_sec": 0, 00:14:20.994 "w_mbytes_per_sec": 0 00:14:20.994 }, 00:14:20.994 "claimed": true, 00:14:20.994 "claim_type": "exclusive_write", 00:14:20.994 "zoned": false, 00:14:20.994 "supported_io_types": { 00:14:20.994 "read": true, 00:14:20.994 "write": true, 00:14:20.994 "unmap": true, 00:14:20.994 "write_zeroes": true, 00:14:20.994 "flush": true, 00:14:20.994 "reset": true, 00:14:20.994 "compare": false, 00:14:20.994 "compare_and_write": false, 00:14:20.994 "abort": true, 00:14:20.994 "nvme_admin": false, 00:14:20.994 "nvme_io": false 00:14:20.994 }, 00:14:20.994 "memory_domains": [ 00:14:20.994 { 00:14:20.994 "dma_device_id": "system", 00:14:20.994 "dma_device_type": 1 00:14:20.994 }, 00:14:20.994 { 00:14:20.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.994 "dma_device_type": 2 00:14:20.994 } 00:14:20.994 ], 00:14:20.994 "driver_specific": {} 00:14:20.994 }' 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:20.994 04:15:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:21.252 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:21.252 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:21.252 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:21.252 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:21.252 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:21.509 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:21.509 "name": "BaseBdev2", 00:14:21.509 "aliases": [ 00:14:21.509 "014d67ea-f6ed-4fb9-9278-57210753401d" 00:14:21.509 ], 00:14:21.509 "product_name": "Malloc disk", 00:14:21.509 "block_size": 512, 00:14:21.509 "num_blocks": 65536, 00:14:21.509 "uuid": "014d67ea-f6ed-4fb9-9278-57210753401d", 00:14:21.509 "assigned_rate_limits": { 00:14:21.509 "rw_ios_per_sec": 0, 00:14:21.509 "rw_mbytes_per_sec": 0, 00:14:21.509 "r_mbytes_per_sec": 0, 00:14:21.509 "w_mbytes_per_sec": 0 00:14:21.509 }, 00:14:21.509 "claimed": true, 00:14:21.509 "claim_type": "exclusive_write", 00:14:21.509 "zoned": false, 00:14:21.509 "supported_io_types": { 00:14:21.509 "read": true, 00:14:21.509 "write": true, 00:14:21.509 "unmap": true, 00:14:21.509 "write_zeroes": true, 00:14:21.509 "flush": true, 00:14:21.509 "reset": true, 00:14:21.509 "compare": false, 00:14:21.509 "compare_and_write": false, 00:14:21.509 "abort": true, 00:14:21.509 "nvme_admin": false, 00:14:21.510 "nvme_io": false 00:14:21.510 }, 00:14:21.510 "memory_domains": [ 00:14:21.510 { 00:14:21.510 "dma_device_id": "system", 00:14:21.510 "dma_device_type": 1 00:14:21.510 }, 00:14:21.510 { 00:14:21.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:21.510 "dma_device_type": 2 00:14:21.510 } 00:14:21.510 ], 00:14:21.510 "driver_specific": {} 00:14:21.510 }' 00:14:21.510 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:21.510 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:21.510 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:21.510 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:21.510 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:21.510 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:21.510 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:21.510 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:21.510 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:21.510 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:21.767 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:21.767 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:21.767 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 
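The property checks above (bdev_raid.sh@204-209) walk every configured member of Existed_Raid and compare its block_size, md_size, md_interleave and dif_type. A minimal stand-alone sketch of that pattern, assuming rpc.py and jq are on PATH, the RPC socket matches this log, and the expected values (512-byte blocks, no metadata, no DIF) match how the malloc base bdevs were created here; this is an illustrative rewrite, not the exact autotest code:

  sock=/var/tmp/spdk-raid.sock
  # names of all configured base bdevs of the raid volume
  names=$(rpc.py -s "$sock" bdev_get_bdevs -b Existed_Raid \
            | jq -r '.[].driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')
  for name in $names; do
      info=$(rpc.py -s "$sock" bdev_get_bdevs -b "$name" | jq '.[]')
      [[ $(jq .block_size    <<< "$info") == 512  ]]   # malloc bdevs here use 512-byte blocks
      [[ $(jq .md_size       <<< "$info") == null ]]   # no separate metadata area
      [[ $(jq .md_interleave <<< "$info") == null ]]
      [[ $(jq .dif_type      <<< "$info") == null ]]   # no DIF protection configured
  done
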
00:14:21.767 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:21.767 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:22.024 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:22.024 "name": "BaseBdev3", 00:14:22.024 "aliases": [ 00:14:22.024 "06dc4ca3-5357-4d72-b83a-970a65e7bd0c" 00:14:22.024 ], 00:14:22.024 "product_name": "Malloc disk", 00:14:22.024 "block_size": 512, 00:14:22.024 "num_blocks": 65536, 00:14:22.024 "uuid": "06dc4ca3-5357-4d72-b83a-970a65e7bd0c", 00:14:22.024 "assigned_rate_limits": { 00:14:22.024 "rw_ios_per_sec": 0, 00:14:22.024 "rw_mbytes_per_sec": 0, 00:14:22.024 "r_mbytes_per_sec": 0, 00:14:22.024 "w_mbytes_per_sec": 0 00:14:22.024 }, 00:14:22.024 "claimed": true, 00:14:22.024 "claim_type": "exclusive_write", 00:14:22.024 "zoned": false, 00:14:22.024 "supported_io_types": { 00:14:22.024 "read": true, 00:14:22.024 "write": true, 00:14:22.024 "unmap": true, 00:14:22.024 "write_zeroes": true, 00:14:22.024 "flush": true, 00:14:22.024 "reset": true, 00:14:22.024 "compare": false, 00:14:22.024 "compare_and_write": false, 00:14:22.024 "abort": true, 00:14:22.024 "nvme_admin": false, 00:14:22.024 "nvme_io": false 00:14:22.024 }, 00:14:22.024 "memory_domains": [ 00:14:22.024 { 00:14:22.024 "dma_device_id": "system", 00:14:22.024 "dma_device_type": 1 00:14:22.024 }, 00:14:22.024 { 00:14:22.024 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:22.024 "dma_device_type": 2 00:14:22.024 } 00:14:22.024 ], 00:14:22.024 "driver_specific": {} 00:14:22.024 }' 00:14:22.024 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:22.024 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:22.024 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:22.024 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:22.024 04:15:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:22.024 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:22.024 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:22.281 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:22.281 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:22.281 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:22.281 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:22.281 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:22.281 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:22.281 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:22.281 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:22.539 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:22.539 "name": "BaseBdev4", 00:14:22.539 "aliases": [ 00:14:22.539 
"e86eb3de-101f-45b6-a852-9fc9720d563a" 00:14:22.539 ], 00:14:22.539 "product_name": "Malloc disk", 00:14:22.539 "block_size": 512, 00:14:22.539 "num_blocks": 65536, 00:14:22.539 "uuid": "e86eb3de-101f-45b6-a852-9fc9720d563a", 00:14:22.539 "assigned_rate_limits": { 00:14:22.539 "rw_ios_per_sec": 0, 00:14:22.539 "rw_mbytes_per_sec": 0, 00:14:22.539 "r_mbytes_per_sec": 0, 00:14:22.539 "w_mbytes_per_sec": 0 00:14:22.539 }, 00:14:22.539 "claimed": true, 00:14:22.539 "claim_type": "exclusive_write", 00:14:22.539 "zoned": false, 00:14:22.539 "supported_io_types": { 00:14:22.539 "read": true, 00:14:22.539 "write": true, 00:14:22.539 "unmap": true, 00:14:22.539 "write_zeroes": true, 00:14:22.539 "flush": true, 00:14:22.539 "reset": true, 00:14:22.539 "compare": false, 00:14:22.539 "compare_and_write": false, 00:14:22.539 "abort": true, 00:14:22.539 "nvme_admin": false, 00:14:22.539 "nvme_io": false 00:14:22.539 }, 00:14:22.539 "memory_domains": [ 00:14:22.539 { 00:14:22.539 "dma_device_id": "system", 00:14:22.539 "dma_device_type": 1 00:14:22.539 }, 00:14:22.539 { 00:14:22.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:22.539 "dma_device_type": 2 00:14:22.539 } 00:14:22.539 ], 00:14:22.539 "driver_specific": {} 00:14:22.539 }' 00:14:22.539 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:22.539 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:22.539 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:22.539 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:22.539 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:22.796 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:22.796 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:22.796 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:22.796 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:22.796 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:22.796 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:22.797 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:22.797 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:23.055 [2024-05-15 04:15:10.917430] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:23.055 [2024-05-15 04:15:10.917458] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:23.055 [2024-05-15 04:15:10.917546] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:23.055 [2024-05-15 04:15:10.917620] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:23.055 [2024-05-15 04:15:10.917637] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x197f340 name Existed_Raid, state offline 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 3869381 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 
-- # '[' -z 3869381 ']' 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 3869381 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3869381 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3869381' 00:14:23.055 killing process with pid 3869381 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 3869381 00:14:23.055 [2024-05-15 04:15:10.965267] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:23.055 04:15:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 3869381 00:14:23.055 [2024-05-15 04:15:11.012342] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:23.313 04:15:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:14:23.313 00:14:23.313 real 0m32.163s 00:14:23.313 user 1m0.446s 00:14:23.313 sys 0m4.320s 00:14:23.313 04:15:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:14:23.313 04:15:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.313 ************************************ 00:14:23.313 END TEST raid_state_function_test 00:14:23.313 ************************************ 00:14:23.313 04:15:11 bdev_raid -- bdev/bdev_raid.sh@804 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:14:23.313 04:15:11 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:14:23.313 04:15:11 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:14:23.313 04:15:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:23.571 ************************************ 00:14:23.571 START TEST raid_state_function_test_sb 00:14:23.571 ************************************ 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid0 4 true 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid0 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:23.571 04:15:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid0 '!=' raid1 ']' 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=3874520 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3874520' 00:14:23.571 Process raid pid: 3874520 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 3874520 /var/tmp/spdk-raid.sock 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3874520 ']' 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:23.571 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
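From here raid_state_function_test_sb repeats the same state-machine exercise with superblocks enabled (-s). A minimal sketch of the prologue it performs, assuming the same workspace layout and RPC socket shown in this log; the rpc_get_methods loop below is only a readiness probe standing in for the test's waitforlisten helper:

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  sock=/var/tmp/spdk-raid.sock
  # start a bare bdev application that only serves RPCs on a private socket
  $SPDK/test/app/bdev_svc/bdev_svc -r "$sock" -i 0 -L bdev_raid &
  raid_pid=$!
  until $SPDK/scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1; do sleep 0.1; done
  # register a 4-member raid0 with a superblock (-s) before any base bdev exists;
  # the volume therefore stays in the "configuring" state queried further down
  $SPDK/scripts/rpc.py -s "$sock" bdev_raid_create -z 64 -s -r raid0 \
      -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  $SPDK/scripts/rpc.py -s "$sock" bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "Existed_Raid").state'    # expect: configuring
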
00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:14:23.571 04:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:23.571 [2024-05-15 04:15:11.410084] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:14:23.571 [2024-05-15 04:15:11.410177] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:23.571 [2024-05-15 04:15:11.495464] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:23.829 [2024-05-15 04:15:11.620097] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:23.829 [2024-05-15 04:15:11.697781] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:23.829 [2024-05-15 04:15:11.697830] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:23.829 04:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:14:23.829 04:15:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:14:23.829 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:24.086 [2024-05-15 04:15:11.986732] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:24.086 [2024-05-15 04:15:11.986771] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:24.086 [2024-05-15 04:15:11.986782] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:24.086 [2024-05-15 04:15:11.986792] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:24.086 [2024-05-15 04:15:11.986799] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:24.086 [2024-05-15 04:15:11.986809] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:24.086 [2024-05-15 04:15:11.986816] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:24.086 [2024-05-15 04:15:11.986875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:24.086 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:24.086 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:24.086 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:24.086 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:24.086 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:24.086 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:24.086 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:24.086 04:15:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:24.086 
04:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:24.086 04:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:24.086 04:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.086 04:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:24.344 04:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:24.344 "name": "Existed_Raid", 00:14:24.344 "uuid": "353f74d9-1a95-4ef0-a3b5-7288fd81bad4", 00:14:24.344 "strip_size_kb": 64, 00:14:24.344 "state": "configuring", 00:14:24.344 "raid_level": "raid0", 00:14:24.344 "superblock": true, 00:14:24.344 "num_base_bdevs": 4, 00:14:24.344 "num_base_bdevs_discovered": 0, 00:14:24.344 "num_base_bdevs_operational": 4, 00:14:24.344 "base_bdevs_list": [ 00:14:24.344 { 00:14:24.344 "name": "BaseBdev1", 00:14:24.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.344 "is_configured": false, 00:14:24.344 "data_offset": 0, 00:14:24.344 "data_size": 0 00:14:24.344 }, 00:14:24.344 { 00:14:24.344 "name": "BaseBdev2", 00:14:24.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.344 "is_configured": false, 00:14:24.344 "data_offset": 0, 00:14:24.344 "data_size": 0 00:14:24.344 }, 00:14:24.344 { 00:14:24.344 "name": "BaseBdev3", 00:14:24.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.344 "is_configured": false, 00:14:24.344 "data_offset": 0, 00:14:24.344 "data_size": 0 00:14:24.344 }, 00:14:24.344 { 00:14:24.344 "name": "BaseBdev4", 00:14:24.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.344 "is_configured": false, 00:14:24.344 "data_offset": 0, 00:14:24.344 "data_size": 0 00:14:24.344 } 00:14:24.344 ] 00:14:24.344 }' 00:14:24.344 04:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:24.344 04:15:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:24.908 04:15:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:25.165 [2024-05-15 04:15:13.025378] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:25.165 [2024-05-15 04:15:13.025408] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b0040 name Existed_Raid, state configuring 00:14:25.165 04:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:25.422 [2024-05-15 04:15:13.282089] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:25.422 [2024-05-15 04:15:13.282135] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:25.422 [2024-05-15 04:15:13.282147] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:25.422 [2024-05-15 04:15:13.282159] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:25.422 [2024-05-15 04:15:13.282169] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev3 00:14:25.422 [2024-05-15 04:15:13.282187] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:25.422 [2024-05-15 04:15:13.282196] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:25.422 [2024-05-15 04:15:13.282208] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:25.422 04:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:25.680 [2024-05-15 04:15:13.525362] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:25.680 BaseBdev1 00:14:25.680 04:15:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:14:25.680 04:15:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:14:25.680 04:15:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:25.680 04:15:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:25.680 04:15:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:25.680 04:15:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:25.680 04:15:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:25.938 04:15:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:26.196 [ 00:14:26.196 { 00:14:26.196 "name": "BaseBdev1", 00:14:26.196 "aliases": [ 00:14:26.196 "e34e3057-6089-415f-bd80-f38811cef0fc" 00:14:26.196 ], 00:14:26.196 "product_name": "Malloc disk", 00:14:26.196 "block_size": 512, 00:14:26.196 "num_blocks": 65536, 00:14:26.196 "uuid": "e34e3057-6089-415f-bd80-f38811cef0fc", 00:14:26.196 "assigned_rate_limits": { 00:14:26.196 "rw_ios_per_sec": 0, 00:14:26.196 "rw_mbytes_per_sec": 0, 00:14:26.196 "r_mbytes_per_sec": 0, 00:14:26.196 "w_mbytes_per_sec": 0 00:14:26.196 }, 00:14:26.196 "claimed": true, 00:14:26.196 "claim_type": "exclusive_write", 00:14:26.196 "zoned": false, 00:14:26.196 "supported_io_types": { 00:14:26.196 "read": true, 00:14:26.196 "write": true, 00:14:26.196 "unmap": true, 00:14:26.196 "write_zeroes": true, 00:14:26.196 "flush": true, 00:14:26.196 "reset": true, 00:14:26.196 "compare": false, 00:14:26.196 "compare_and_write": false, 00:14:26.196 "abort": true, 00:14:26.196 "nvme_admin": false, 00:14:26.196 "nvme_io": false 00:14:26.196 }, 00:14:26.196 "memory_domains": [ 00:14:26.196 { 00:14:26.196 "dma_device_id": "system", 00:14:26.196 "dma_device_type": 1 00:14:26.196 }, 00:14:26.196 { 00:14:26.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.196 "dma_device_type": 2 00:14:26.196 } 00:14:26.196 ], 00:14:26.196 "driver_specific": {} 00:14:26.196 } 00:14:26.196 ] 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:26.196 
04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.196 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:26.454 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:26.454 "name": "Existed_Raid", 00:14:26.454 "uuid": "9f4b7e2b-5ab4-4214-8fc7-38aa363a43b4", 00:14:26.454 "strip_size_kb": 64, 00:14:26.454 "state": "configuring", 00:14:26.454 "raid_level": "raid0", 00:14:26.454 "superblock": true, 00:14:26.454 "num_base_bdevs": 4, 00:14:26.454 "num_base_bdevs_discovered": 1, 00:14:26.454 "num_base_bdevs_operational": 4, 00:14:26.454 "base_bdevs_list": [ 00:14:26.454 { 00:14:26.454 "name": "BaseBdev1", 00:14:26.454 "uuid": "e34e3057-6089-415f-bd80-f38811cef0fc", 00:14:26.454 "is_configured": true, 00:14:26.454 "data_offset": 2048, 00:14:26.454 "data_size": 63488 00:14:26.454 }, 00:14:26.454 { 00:14:26.454 "name": "BaseBdev2", 00:14:26.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:26.454 "is_configured": false, 00:14:26.454 "data_offset": 0, 00:14:26.454 "data_size": 0 00:14:26.454 }, 00:14:26.454 { 00:14:26.454 "name": "BaseBdev3", 00:14:26.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:26.454 "is_configured": false, 00:14:26.454 "data_offset": 0, 00:14:26.454 "data_size": 0 00:14:26.454 }, 00:14:26.455 { 00:14:26.455 "name": "BaseBdev4", 00:14:26.455 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:26.455 "is_configured": false, 00:14:26.455 "data_offset": 0, 00:14:26.455 "data_size": 0 00:14:26.455 } 00:14:26.455 ] 00:14:26.455 }' 00:14:26.455 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:26.455 04:15:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:27.020 04:15:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:27.278 [2024-05-15 04:15:15.049381] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:27.278 [2024-05-15 04:15:15.049428] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12af8b0 name Existed_Raid, state configuring 00:14:27.278 04:15:15 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:27.278 [2024-05-15 04:15:15.286060] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:27.278 [2024-05-15 04:15:15.287597] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:27.278 [2024-05-15 04:15:15.287633] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:27.278 [2024-05-15 04:15:15.287653] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:27.278 [2024-05-15 04:15:15.287666] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:27.278 [2024-05-15 04:15:15.287675] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:14:27.278 [2024-05-15 04:15:15.287688] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.537 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:27.794 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:27.794 "name": "Existed_Raid", 00:14:27.794 "uuid": "1fb43748-b608-40c3-a81f-3cce31fc4a9e", 00:14:27.794 "strip_size_kb": 64, 00:14:27.794 "state": "configuring", 00:14:27.794 "raid_level": "raid0", 00:14:27.794 "superblock": true, 00:14:27.794 "num_base_bdevs": 4, 00:14:27.794 "num_base_bdevs_discovered": 1, 00:14:27.794 "num_base_bdevs_operational": 4, 00:14:27.794 "base_bdevs_list": [ 00:14:27.794 { 00:14:27.794 "name": "BaseBdev1", 00:14:27.794 "uuid": "e34e3057-6089-415f-bd80-f38811cef0fc", 00:14:27.794 "is_configured": true, 00:14:27.794 "data_offset": 2048, 
00:14:27.794 "data_size": 63488 00:14:27.794 }, 00:14:27.794 { 00:14:27.794 "name": "BaseBdev2", 00:14:27.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.794 "is_configured": false, 00:14:27.794 "data_offset": 0, 00:14:27.794 "data_size": 0 00:14:27.794 }, 00:14:27.794 { 00:14:27.794 "name": "BaseBdev3", 00:14:27.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.794 "is_configured": false, 00:14:27.794 "data_offset": 0, 00:14:27.794 "data_size": 0 00:14:27.794 }, 00:14:27.794 { 00:14:27.794 "name": "BaseBdev4", 00:14:27.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:27.794 "is_configured": false, 00:14:27.794 "data_offset": 0, 00:14:27.794 "data_size": 0 00:14:27.794 } 00:14:27.794 ] 00:14:27.794 }' 00:14:27.794 04:15:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:27.794 04:15:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:28.358 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:28.358 [2024-05-15 04:15:16.365653] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:28.358 BaseBdev2 00:14:28.615 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:14:28.615 04:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:14:28.615 04:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:28.615 04:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:28.615 04:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:28.615 04:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:28.615 04:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:28.872 04:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:28.872 [ 00:14:28.872 { 00:14:28.872 "name": "BaseBdev2", 00:14:28.872 "aliases": [ 00:14:28.872 "97bc9e38-aa00-4d39-a666-a0d869d3c59d" 00:14:28.872 ], 00:14:28.872 "product_name": "Malloc disk", 00:14:28.872 "block_size": 512, 00:14:28.872 "num_blocks": 65536, 00:14:28.872 "uuid": "97bc9e38-aa00-4d39-a666-a0d869d3c59d", 00:14:28.872 "assigned_rate_limits": { 00:14:28.872 "rw_ios_per_sec": 0, 00:14:28.872 "rw_mbytes_per_sec": 0, 00:14:28.872 "r_mbytes_per_sec": 0, 00:14:28.872 "w_mbytes_per_sec": 0 00:14:28.872 }, 00:14:28.872 "claimed": true, 00:14:28.872 "claim_type": "exclusive_write", 00:14:28.872 "zoned": false, 00:14:28.872 "supported_io_types": { 00:14:28.872 "read": true, 00:14:28.872 "write": true, 00:14:28.872 "unmap": true, 00:14:28.872 "write_zeroes": true, 00:14:28.872 "flush": true, 00:14:28.872 "reset": true, 00:14:28.872 "compare": false, 00:14:28.872 "compare_and_write": false, 00:14:28.872 "abort": true, 00:14:28.872 "nvme_admin": false, 00:14:28.872 "nvme_io": false 00:14:28.872 }, 00:14:28.872 "memory_domains": [ 00:14:28.872 { 00:14:28.872 "dma_device_id": "system", 
00:14:28.872 "dma_device_type": 1 00:14:28.872 }, 00:14:28.872 { 00:14:28.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.872 "dma_device_type": 2 00:14:28.872 } 00:14:28.872 ], 00:14:28.872 "driver_specific": {} 00:14:28.872 } 00:14:28.872 ] 00:14:28.872 04:15:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:28.872 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:14:28.872 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:28.872 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:28.872 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:28.872 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:28.872 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:28.872 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:28.872 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:28.873 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:28.873 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:28.873 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:28.873 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:29.130 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.130 04:15:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:29.130 04:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:29.130 "name": "Existed_Raid", 00:14:29.130 "uuid": "1fb43748-b608-40c3-a81f-3cce31fc4a9e", 00:14:29.130 "strip_size_kb": 64, 00:14:29.130 "state": "configuring", 00:14:29.130 "raid_level": "raid0", 00:14:29.130 "superblock": true, 00:14:29.130 "num_base_bdevs": 4, 00:14:29.130 "num_base_bdevs_discovered": 2, 00:14:29.130 "num_base_bdevs_operational": 4, 00:14:29.130 "base_bdevs_list": [ 00:14:29.130 { 00:14:29.130 "name": "BaseBdev1", 00:14:29.130 "uuid": "e34e3057-6089-415f-bd80-f38811cef0fc", 00:14:29.130 "is_configured": true, 00:14:29.130 "data_offset": 2048, 00:14:29.130 "data_size": 63488 00:14:29.130 }, 00:14:29.130 { 00:14:29.130 "name": "BaseBdev2", 00:14:29.130 "uuid": "97bc9e38-aa00-4d39-a666-a0d869d3c59d", 00:14:29.130 "is_configured": true, 00:14:29.130 "data_offset": 2048, 00:14:29.130 "data_size": 63488 00:14:29.130 }, 00:14:29.130 { 00:14:29.130 "name": "BaseBdev3", 00:14:29.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:29.130 "is_configured": false, 00:14:29.130 "data_offset": 0, 00:14:29.130 "data_size": 0 00:14:29.130 }, 00:14:29.130 { 00:14:29.130 "name": "BaseBdev4", 00:14:29.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:29.130 "is_configured": false, 00:14:29.130 "data_offset": 0, 00:14:29.130 "data_size": 0 00:14:29.130 } 00:14:29.130 ] 00:14:29.130 }' 00:14:29.130 
04:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:29.130 04:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:29.695 04:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:29.953 [2024-05-15 04:15:17.943594] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:29.953 BaseBdev3 00:14:29.953 04:15:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:14:29.953 04:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:14:29.953 04:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:29.953 04:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:29.953 04:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:29.953 04:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:29.953 04:15:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:30.517 04:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:30.517 [ 00:14:30.517 { 00:14:30.517 "name": "BaseBdev3", 00:14:30.517 "aliases": [ 00:14:30.517 "4a74d964-ac0d-4ace-8647-55614eb972be" 00:14:30.517 ], 00:14:30.517 "product_name": "Malloc disk", 00:14:30.517 "block_size": 512, 00:14:30.517 "num_blocks": 65536, 00:14:30.517 "uuid": "4a74d964-ac0d-4ace-8647-55614eb972be", 00:14:30.517 "assigned_rate_limits": { 00:14:30.517 "rw_ios_per_sec": 0, 00:14:30.517 "rw_mbytes_per_sec": 0, 00:14:30.517 "r_mbytes_per_sec": 0, 00:14:30.517 "w_mbytes_per_sec": 0 00:14:30.517 }, 00:14:30.517 "claimed": true, 00:14:30.517 "claim_type": "exclusive_write", 00:14:30.517 "zoned": false, 00:14:30.517 "supported_io_types": { 00:14:30.517 "read": true, 00:14:30.517 "write": true, 00:14:30.517 "unmap": true, 00:14:30.517 "write_zeroes": true, 00:14:30.517 "flush": true, 00:14:30.517 "reset": true, 00:14:30.517 "compare": false, 00:14:30.517 "compare_and_write": false, 00:14:30.517 "abort": true, 00:14:30.517 "nvme_admin": false, 00:14:30.517 "nvme_io": false 00:14:30.517 }, 00:14:30.517 "memory_domains": [ 00:14:30.517 { 00:14:30.517 "dma_device_id": "system", 00:14:30.517 "dma_device_type": 1 00:14:30.517 }, 00:14:30.517 { 00:14:30.517 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:30.517 "dma_device_type": 2 00:14:30.517 } 00:14:30.517 ], 00:14:30.517 "driver_specific": {} 00:14:30.517 } 00:14:30.517 ] 00:14:30.517 04:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:30.517 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:30.775 04:15:18 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:30.775 "name": "Existed_Raid", 00:14:30.775 "uuid": "1fb43748-b608-40c3-a81f-3cce31fc4a9e", 00:14:30.775 "strip_size_kb": 64, 00:14:30.775 "state": "configuring", 00:14:30.775 "raid_level": "raid0", 00:14:30.775 "superblock": true, 00:14:30.775 "num_base_bdevs": 4, 00:14:30.775 "num_base_bdevs_discovered": 3, 00:14:30.775 "num_base_bdevs_operational": 4, 00:14:30.775 "base_bdevs_list": [ 00:14:30.775 { 00:14:30.775 "name": "BaseBdev1", 00:14:30.775 "uuid": "e34e3057-6089-415f-bd80-f38811cef0fc", 00:14:30.775 "is_configured": true, 00:14:30.775 "data_offset": 2048, 00:14:30.775 "data_size": 63488 00:14:30.775 }, 00:14:30.775 { 00:14:30.775 "name": "BaseBdev2", 00:14:30.775 "uuid": "97bc9e38-aa00-4d39-a666-a0d869d3c59d", 00:14:30.775 "is_configured": true, 00:14:30.775 "data_offset": 2048, 00:14:30.775 "data_size": 63488 00:14:30.775 }, 00:14:30.775 { 00:14:30.775 "name": "BaseBdev3", 00:14:30.775 "uuid": "4a74d964-ac0d-4ace-8647-55614eb972be", 00:14:30.775 "is_configured": true, 00:14:30.775 "data_offset": 2048, 00:14:30.775 "data_size": 63488 00:14:30.775 }, 00:14:30.775 { 00:14:30.775 "name": "BaseBdev4", 00:14:30.775 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:30.775 "is_configured": false, 00:14:30.775 "data_offset": 0, 00:14:30.775 "data_size": 0 00:14:30.775 } 00:14:30.775 ] 00:14:30.775 }' 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:30.775 04:15:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:31.340 04:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:31.598 [2024-05-15 04:15:19.564162] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:14:31.598 [2024-05-15 04:15:19.564377] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x12b07f0 00:14:31.598 [2024-05-15 04:15:19.564392] 
bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:14:31.598 [2024-05-15 04:15:19.564529] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14645f0 00:14:31.598 [2024-05-15 04:15:19.564652] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12b07f0 00:14:31.598 [2024-05-15 04:15:19.564665] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12b07f0 00:14:31.598 [2024-05-15 04:15:19.564751] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:31.598 BaseBdev4 00:14:31.598 04:15:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:14:31.598 04:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:14:31.598 04:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:31.598 04:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:31.598 04:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:31.598 04:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:31.598 04:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:31.856 04:15:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:32.113 [ 00:14:32.113 { 00:14:32.113 "name": "BaseBdev4", 00:14:32.113 "aliases": [ 00:14:32.113 "b883e9b8-321c-4348-8fe4-91098633d4f6" 00:14:32.113 ], 00:14:32.113 "product_name": "Malloc disk", 00:14:32.113 "block_size": 512, 00:14:32.113 "num_blocks": 65536, 00:14:32.113 "uuid": "b883e9b8-321c-4348-8fe4-91098633d4f6", 00:14:32.113 "assigned_rate_limits": { 00:14:32.113 "rw_ios_per_sec": 0, 00:14:32.113 "rw_mbytes_per_sec": 0, 00:14:32.113 "r_mbytes_per_sec": 0, 00:14:32.113 "w_mbytes_per_sec": 0 00:14:32.113 }, 00:14:32.113 "claimed": true, 00:14:32.113 "claim_type": "exclusive_write", 00:14:32.113 "zoned": false, 00:14:32.113 "supported_io_types": { 00:14:32.113 "read": true, 00:14:32.113 "write": true, 00:14:32.113 "unmap": true, 00:14:32.113 "write_zeroes": true, 00:14:32.113 "flush": true, 00:14:32.113 "reset": true, 00:14:32.113 "compare": false, 00:14:32.113 "compare_and_write": false, 00:14:32.113 "abort": true, 00:14:32.113 "nvme_admin": false, 00:14:32.113 "nvme_io": false 00:14:32.113 }, 00:14:32.113 "memory_domains": [ 00:14:32.113 { 00:14:32.113 "dma_device_id": "system", 00:14:32.113 "dma_device_type": 1 00:14:32.113 }, 00:14:32.113 { 00:14:32.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.113 "dma_device_type": 2 00:14:32.113 } 00:14:32.113 ], 00:14:32.113 "driver_specific": {} 00:14:32.113 } 00:14:32.113 ] 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online 
raid0 64 4 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.114 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:32.371 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:32.371 "name": "Existed_Raid", 00:14:32.371 "uuid": "1fb43748-b608-40c3-a81f-3cce31fc4a9e", 00:14:32.371 "strip_size_kb": 64, 00:14:32.371 "state": "online", 00:14:32.371 "raid_level": "raid0", 00:14:32.371 "superblock": true, 00:14:32.371 "num_base_bdevs": 4, 00:14:32.371 "num_base_bdevs_discovered": 4, 00:14:32.371 "num_base_bdevs_operational": 4, 00:14:32.371 "base_bdevs_list": [ 00:14:32.371 { 00:14:32.371 "name": "BaseBdev1", 00:14:32.371 "uuid": "e34e3057-6089-415f-bd80-f38811cef0fc", 00:14:32.371 "is_configured": true, 00:14:32.371 "data_offset": 2048, 00:14:32.371 "data_size": 63488 00:14:32.371 }, 00:14:32.371 { 00:14:32.371 "name": "BaseBdev2", 00:14:32.371 "uuid": "97bc9e38-aa00-4d39-a666-a0d869d3c59d", 00:14:32.371 "is_configured": true, 00:14:32.371 "data_offset": 2048, 00:14:32.371 "data_size": 63488 00:14:32.371 }, 00:14:32.371 { 00:14:32.371 "name": "BaseBdev3", 00:14:32.371 "uuid": "4a74d964-ac0d-4ace-8647-55614eb972be", 00:14:32.371 "is_configured": true, 00:14:32.371 "data_offset": 2048, 00:14:32.371 "data_size": 63488 00:14:32.371 }, 00:14:32.371 { 00:14:32.371 "name": "BaseBdev4", 00:14:32.371 "uuid": "b883e9b8-321c-4348-8fe4-91098633d4f6", 00:14:32.371 "is_configured": true, 00:14:32.371 "data_offset": 2048, 00:14:32.371 "data_size": 63488 00:14:32.371 } 00:14:32.371 ] 00:14:32.371 }' 00:14:32.371 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:32.371 04:15:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:32.950 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:14:32.950 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:14:32.950 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:32.950 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:32.950 04:15:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:32.950 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:14:32.950 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:32.950 04:15:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:33.208 [2024-05-15 04:15:21.140655] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:33.208 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:33.208 "name": "Existed_Raid", 00:14:33.208 "aliases": [ 00:14:33.208 "1fb43748-b608-40c3-a81f-3cce31fc4a9e" 00:14:33.208 ], 00:14:33.208 "product_name": "Raid Volume", 00:14:33.208 "block_size": 512, 00:14:33.208 "num_blocks": 253952, 00:14:33.208 "uuid": "1fb43748-b608-40c3-a81f-3cce31fc4a9e", 00:14:33.208 "assigned_rate_limits": { 00:14:33.208 "rw_ios_per_sec": 0, 00:14:33.208 "rw_mbytes_per_sec": 0, 00:14:33.208 "r_mbytes_per_sec": 0, 00:14:33.208 "w_mbytes_per_sec": 0 00:14:33.208 }, 00:14:33.208 "claimed": false, 00:14:33.208 "zoned": false, 00:14:33.208 "supported_io_types": { 00:14:33.208 "read": true, 00:14:33.208 "write": true, 00:14:33.208 "unmap": true, 00:14:33.208 "write_zeroes": true, 00:14:33.208 "flush": true, 00:14:33.208 "reset": true, 00:14:33.208 "compare": false, 00:14:33.208 "compare_and_write": false, 00:14:33.208 "abort": false, 00:14:33.208 "nvme_admin": false, 00:14:33.208 "nvme_io": false 00:14:33.208 }, 00:14:33.208 "memory_domains": [ 00:14:33.208 { 00:14:33.208 "dma_device_id": "system", 00:14:33.208 "dma_device_type": 1 00:14:33.208 }, 00:14:33.208 { 00:14:33.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.208 "dma_device_type": 2 00:14:33.208 }, 00:14:33.208 { 00:14:33.208 "dma_device_id": "system", 00:14:33.208 "dma_device_type": 1 00:14:33.208 }, 00:14:33.208 { 00:14:33.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.208 "dma_device_type": 2 00:14:33.208 }, 00:14:33.208 { 00:14:33.208 "dma_device_id": "system", 00:14:33.208 "dma_device_type": 1 00:14:33.208 }, 00:14:33.208 { 00:14:33.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.208 "dma_device_type": 2 00:14:33.208 }, 00:14:33.208 { 00:14:33.208 "dma_device_id": "system", 00:14:33.208 "dma_device_type": 1 00:14:33.208 }, 00:14:33.208 { 00:14:33.208 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.208 "dma_device_type": 2 00:14:33.208 } 00:14:33.208 ], 00:14:33.208 "driver_specific": { 00:14:33.208 "raid": { 00:14:33.208 "uuid": "1fb43748-b608-40c3-a81f-3cce31fc4a9e", 00:14:33.208 "strip_size_kb": 64, 00:14:33.208 "state": "online", 00:14:33.209 "raid_level": "raid0", 00:14:33.209 "superblock": true, 00:14:33.209 "num_base_bdevs": 4, 00:14:33.209 "num_base_bdevs_discovered": 4, 00:14:33.209 "num_base_bdevs_operational": 4, 00:14:33.209 "base_bdevs_list": [ 00:14:33.209 { 00:14:33.209 "name": "BaseBdev1", 00:14:33.209 "uuid": "e34e3057-6089-415f-bd80-f38811cef0fc", 00:14:33.209 "is_configured": true, 00:14:33.209 "data_offset": 2048, 00:14:33.209 "data_size": 63488 00:14:33.209 }, 00:14:33.209 { 00:14:33.209 "name": "BaseBdev2", 00:14:33.209 "uuid": "97bc9e38-aa00-4d39-a666-a0d869d3c59d", 00:14:33.209 "is_configured": true, 00:14:33.209 "data_offset": 2048, 00:14:33.209 "data_size": 63488 00:14:33.209 }, 00:14:33.209 { 00:14:33.209 "name": "BaseBdev3", 
00:14:33.209 "uuid": "4a74d964-ac0d-4ace-8647-55614eb972be", 00:14:33.209 "is_configured": true, 00:14:33.209 "data_offset": 2048, 00:14:33.209 "data_size": 63488 00:14:33.209 }, 00:14:33.209 { 00:14:33.209 "name": "BaseBdev4", 00:14:33.209 "uuid": "b883e9b8-321c-4348-8fe4-91098633d4f6", 00:14:33.209 "is_configured": true, 00:14:33.209 "data_offset": 2048, 00:14:33.209 "data_size": 63488 00:14:33.209 } 00:14:33.209 ] 00:14:33.209 } 00:14:33.209 } 00:14:33.209 }' 00:14:33.209 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:33.209 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:14:33.209 BaseBdev2 00:14:33.209 BaseBdev3 00:14:33.209 BaseBdev4' 00:14:33.209 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:33.209 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:33.209 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:33.774 "name": "BaseBdev1", 00:14:33.774 "aliases": [ 00:14:33.774 "e34e3057-6089-415f-bd80-f38811cef0fc" 00:14:33.774 ], 00:14:33.774 "product_name": "Malloc disk", 00:14:33.774 "block_size": 512, 00:14:33.774 "num_blocks": 65536, 00:14:33.774 "uuid": "e34e3057-6089-415f-bd80-f38811cef0fc", 00:14:33.774 "assigned_rate_limits": { 00:14:33.774 "rw_ios_per_sec": 0, 00:14:33.774 "rw_mbytes_per_sec": 0, 00:14:33.774 "r_mbytes_per_sec": 0, 00:14:33.774 "w_mbytes_per_sec": 0 00:14:33.774 }, 00:14:33.774 "claimed": true, 00:14:33.774 "claim_type": "exclusive_write", 00:14:33.774 "zoned": false, 00:14:33.774 "supported_io_types": { 00:14:33.774 "read": true, 00:14:33.774 "write": true, 00:14:33.774 "unmap": true, 00:14:33.774 "write_zeroes": true, 00:14:33.774 "flush": true, 00:14:33.774 "reset": true, 00:14:33.774 "compare": false, 00:14:33.774 "compare_and_write": false, 00:14:33.774 "abort": true, 00:14:33.774 "nvme_admin": false, 00:14:33.774 "nvme_io": false 00:14:33.774 }, 00:14:33.774 "memory_domains": [ 00:14:33.774 { 00:14:33.774 "dma_device_id": "system", 00:14:33.774 "dma_device_type": 1 00:14:33.774 }, 00:14:33.774 { 00:14:33.774 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.774 "dma_device_type": 2 00:14:33.774 } 00:14:33.774 ], 00:14:33.774 "driver_specific": {} 00:14:33.774 }' 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:33.774 04:15:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:33.774 04:15:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:34.032 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:34.032 "name": "BaseBdev2", 00:14:34.032 "aliases": [ 00:14:34.032 "97bc9e38-aa00-4d39-a666-a0d869d3c59d" 00:14:34.032 ], 00:14:34.032 "product_name": "Malloc disk", 00:14:34.032 "block_size": 512, 00:14:34.032 "num_blocks": 65536, 00:14:34.032 "uuid": "97bc9e38-aa00-4d39-a666-a0d869d3c59d", 00:14:34.032 "assigned_rate_limits": { 00:14:34.032 "rw_ios_per_sec": 0, 00:14:34.032 "rw_mbytes_per_sec": 0, 00:14:34.032 "r_mbytes_per_sec": 0, 00:14:34.032 "w_mbytes_per_sec": 0 00:14:34.032 }, 00:14:34.032 "claimed": true, 00:14:34.032 "claim_type": "exclusive_write", 00:14:34.032 "zoned": false, 00:14:34.032 "supported_io_types": { 00:14:34.032 "read": true, 00:14:34.032 "write": true, 00:14:34.032 "unmap": true, 00:14:34.032 "write_zeroes": true, 00:14:34.032 "flush": true, 00:14:34.032 "reset": true, 00:14:34.032 "compare": false, 00:14:34.032 "compare_and_write": false, 00:14:34.032 "abort": true, 00:14:34.032 "nvme_admin": false, 00:14:34.032 "nvme_io": false 00:14:34.032 }, 00:14:34.032 "memory_domains": [ 00:14:34.032 { 00:14:34.032 "dma_device_id": "system", 00:14:34.032 "dma_device_type": 1 00:14:34.032 }, 00:14:34.032 { 00:14:34.032 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.032 "dma_device_type": 2 00:14:34.032 } 00:14:34.032 ], 00:14:34.032 "driver_specific": {} 00:14:34.032 }' 00:14:34.032 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:34.032 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:34.289 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:34.289 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:34.289 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:34.289 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:34.289 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:34.289 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:34.289 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:34.289 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:34.289 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:34.547 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:34.547 04:15:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:34.547 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:34.547 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:34.547 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:34.547 "name": "BaseBdev3", 00:14:34.547 "aliases": [ 00:14:34.547 "4a74d964-ac0d-4ace-8647-55614eb972be" 00:14:34.547 ], 00:14:34.547 "product_name": "Malloc disk", 00:14:34.547 "block_size": 512, 00:14:34.547 "num_blocks": 65536, 00:14:34.547 "uuid": "4a74d964-ac0d-4ace-8647-55614eb972be", 00:14:34.547 "assigned_rate_limits": { 00:14:34.547 "rw_ios_per_sec": 0, 00:14:34.547 "rw_mbytes_per_sec": 0, 00:14:34.547 "r_mbytes_per_sec": 0, 00:14:34.547 "w_mbytes_per_sec": 0 00:14:34.547 }, 00:14:34.547 "claimed": true, 00:14:34.547 "claim_type": "exclusive_write", 00:14:34.547 "zoned": false, 00:14:34.547 "supported_io_types": { 00:14:34.547 "read": true, 00:14:34.547 "write": true, 00:14:34.547 "unmap": true, 00:14:34.547 "write_zeroes": true, 00:14:34.547 "flush": true, 00:14:34.547 "reset": true, 00:14:34.547 "compare": false, 00:14:34.547 "compare_and_write": false, 00:14:34.547 "abort": true, 00:14:34.547 "nvme_admin": false, 00:14:34.547 "nvme_io": false 00:14:34.547 }, 00:14:34.547 "memory_domains": [ 00:14:34.547 { 00:14:34.547 "dma_device_id": "system", 00:14:34.547 "dma_device_type": 1 00:14:34.547 }, 00:14:34.547 { 00:14:34.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.547 "dma_device_type": 2 00:14:34.547 } 00:14:34.547 ], 00:14:34.547 "driver_specific": {} 00:14:34.547 }' 00:14:34.547 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:34.804 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:34.804 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:34.804 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:34.804 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:34.804 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:34.804 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:34.804 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:34.804 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:34.804 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:35.061 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:35.061 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:35.061 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:35.062 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:35.062 04:15:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:35.319 04:15:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:35.319 "name": "BaseBdev4", 00:14:35.319 "aliases": [ 00:14:35.319 "b883e9b8-321c-4348-8fe4-91098633d4f6" 00:14:35.319 ], 00:14:35.319 "product_name": "Malloc disk", 00:14:35.319 "block_size": 512, 00:14:35.319 "num_blocks": 65536, 00:14:35.319 "uuid": "b883e9b8-321c-4348-8fe4-91098633d4f6", 00:14:35.319 "assigned_rate_limits": { 00:14:35.319 "rw_ios_per_sec": 0, 00:14:35.319 "rw_mbytes_per_sec": 0, 00:14:35.319 "r_mbytes_per_sec": 0, 00:14:35.319 "w_mbytes_per_sec": 0 00:14:35.319 }, 00:14:35.319 "claimed": true, 00:14:35.319 "claim_type": "exclusive_write", 00:14:35.319 "zoned": false, 00:14:35.319 "supported_io_types": { 00:14:35.319 "read": true, 00:14:35.319 "write": true, 00:14:35.319 "unmap": true, 00:14:35.319 "write_zeroes": true, 00:14:35.319 "flush": true, 00:14:35.319 "reset": true, 00:14:35.319 "compare": false, 00:14:35.319 "compare_and_write": false, 00:14:35.319 "abort": true, 00:14:35.319 "nvme_admin": false, 00:14:35.319 "nvme_io": false 00:14:35.319 }, 00:14:35.319 "memory_domains": [ 00:14:35.319 { 00:14:35.319 "dma_device_id": "system", 00:14:35.319 "dma_device_type": 1 00:14:35.319 }, 00:14:35.319 { 00:14:35.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.319 "dma_device_type": 2 00:14:35.319 } 00:14:35.319 ], 00:14:35.319 "driver_specific": {} 00:14:35.319 }' 00:14:35.319 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:35.319 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:35.319 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:35.319 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:35.319 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:35.319 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:35.319 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:35.319 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:35.577 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:35.577 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:35.578 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:35.578 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:35.578 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:35.836 [2024-05-15 04:15:23.643154] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:35.836 [2024-05-15 04:15:23.643182] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:35.836 [2024-05-15 04:15:23.643236] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:35.836 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:14:35.836 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid0 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 
in 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.837 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.094 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:36.094 "name": "Existed_Raid", 00:14:36.094 "uuid": "1fb43748-b608-40c3-a81f-3cce31fc4a9e", 00:14:36.094 "strip_size_kb": 64, 00:14:36.094 "state": "offline", 00:14:36.094 "raid_level": "raid0", 00:14:36.094 "superblock": true, 00:14:36.094 "num_base_bdevs": 4, 00:14:36.094 "num_base_bdevs_discovered": 3, 00:14:36.094 "num_base_bdevs_operational": 3, 00:14:36.094 "base_bdevs_list": [ 00:14:36.094 { 00:14:36.094 "name": null, 00:14:36.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.094 "is_configured": false, 00:14:36.094 "data_offset": 2048, 00:14:36.094 "data_size": 63488 00:14:36.094 }, 00:14:36.094 { 00:14:36.094 "name": "BaseBdev2", 00:14:36.094 "uuid": "97bc9e38-aa00-4d39-a666-a0d869d3c59d", 00:14:36.094 "is_configured": true, 00:14:36.094 "data_offset": 2048, 00:14:36.094 "data_size": 63488 00:14:36.094 }, 00:14:36.094 { 00:14:36.094 "name": "BaseBdev3", 00:14:36.094 "uuid": "4a74d964-ac0d-4ace-8647-55614eb972be", 00:14:36.094 "is_configured": true, 00:14:36.094 "data_offset": 2048, 00:14:36.094 "data_size": 63488 00:14:36.094 }, 00:14:36.094 { 00:14:36.094 "name": "BaseBdev4", 00:14:36.094 "uuid": "b883e9b8-321c-4348-8fe4-91098633d4f6", 00:14:36.094 "is_configured": true, 00:14:36.094 "data_offset": 2048, 00:14:36.094 "data_size": 63488 00:14:36.094 } 00:14:36.094 ] 00:14:36.094 }' 00:14:36.094 04:15:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:36.094 04:15:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:36.659 04:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:14:36.659 04:15:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:36.659 04:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.659 04:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:36.917 04:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:36.917 04:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:36.917 04:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:37.175 [2024-05-15 04:15:24.949459] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:37.175 04:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:37.175 04:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:37.175 04:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.175 04:15:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:37.432 04:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:37.432 04:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:37.432 04:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:37.690 [2024-05-15 04:15:25.487922] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:37.690 04:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:37.690 04:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:37.690 04:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.690 04:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:14:37.948 04:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:14:37.948 04:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:37.948 04:15:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:14:38.206 [2024-05-15 04:15:25.976533] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:14:38.206 [2024-05-15 04:15:25.976587] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12b07f0 name Existed_Raid, state offline 00:14:38.206 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:14:38.206 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:14:38.206 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.206 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:14:38.464 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:14:38.464 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:14:38.464 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:14:38.464 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:14:38.464 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:38.464 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:38.722 BaseBdev2 00:14:38.722 04:15:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:14:38.722 04:15:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:14:38.722 04:15:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:38.722 04:15:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:38.722 04:15:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:38.722 04:15:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:38.722 04:15:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:38.979 04:15:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:39.237 [ 00:14:39.237 { 00:14:39.237 "name": "BaseBdev2", 00:14:39.237 "aliases": [ 00:14:39.237 "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae" 00:14:39.237 ], 00:14:39.237 "product_name": "Malloc disk", 00:14:39.237 "block_size": 512, 00:14:39.237 "num_blocks": 65536, 00:14:39.237 "uuid": "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:39.237 "assigned_rate_limits": { 00:14:39.237 "rw_ios_per_sec": 0, 00:14:39.237 "rw_mbytes_per_sec": 0, 00:14:39.237 "r_mbytes_per_sec": 0, 00:14:39.237 "w_mbytes_per_sec": 0 00:14:39.237 }, 00:14:39.237 "claimed": false, 00:14:39.237 "zoned": false, 00:14:39.237 "supported_io_types": { 00:14:39.237 "read": true, 00:14:39.237 "write": true, 00:14:39.237 "unmap": true, 00:14:39.237 "write_zeroes": true, 00:14:39.237 "flush": true, 00:14:39.237 "reset": true, 00:14:39.237 "compare": false, 00:14:39.237 "compare_and_write": false, 00:14:39.237 "abort": true, 00:14:39.237 "nvme_admin": false, 00:14:39.237 "nvme_io": false 00:14:39.237 }, 00:14:39.237 "memory_domains": [ 00:14:39.237 { 00:14:39.237 "dma_device_id": "system", 00:14:39.237 "dma_device_type": 1 00:14:39.237 }, 00:14:39.237 { 00:14:39.237 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.237 "dma_device_type": 2 00:14:39.237 } 00:14:39.237 ], 00:14:39.237 "driver_specific": {} 00:14:39.237 } 00:14:39.237 ] 00:14:39.237 04:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 
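The trace above drives the raid state machine purely through SPDK RPC calls. The lines below are a condensed, hypothetical sketch of that call sequence, not part of the captured output and not the actual bdev_raid.sh helpers; it assumes an SPDK target is already running and serving /var/tmp/spdk-raid.sock, and abbreviates the full workspace path to rpc.py.
# Condensed sketch of the RPC sequence seen in the trace (assumptions: target already
# listening on /var/tmp/spdk-raid.sock; rpc.py stands for spdk/scripts/rpc.py).
sock=/var/tmp/spdk-raid.sock
rpc=rpc.py
# Request the raid0 volume first; with no base bdevs present it stays in "configuring".
$rpc -s $sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
# Add the 32 MiB / 512-byte-block malloc base bdevs one at a time; once the fourth is
# claimed, num_base_bdevs_discovered reaches 4 and the raid moves from "configuring" to "online".
for b in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
    $rpc -s $sock bdev_malloc_create 32 512 -b $b
    $rpc -s $sock bdev_wait_for_examine
    $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
done
# raid0 has no redundancy, so removing any base bdev drops the volume to "offline".
$rpc -s $sock bdev_malloc_delete BaseBdev1
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'
# Tear down the raid bdev.
$rpc -s $sock bdev_raid_delete Existed_Raid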
00:14:39.237 04:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:39.237 04:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:39.237 04:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:39.494 BaseBdev3 00:14:39.494 04:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:14:39.494 04:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:14:39.494 04:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:39.494 04:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:39.494 04:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:39.494 04:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:39.495 04:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:39.752 04:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:40.010 [ 00:14:40.010 { 00:14:40.010 "name": "BaseBdev3", 00:14:40.010 "aliases": [ 00:14:40.010 "d27965af-0e90-41e5-8237-3bfe64be15e1" 00:14:40.010 ], 00:14:40.010 "product_name": "Malloc disk", 00:14:40.010 "block_size": 512, 00:14:40.010 "num_blocks": 65536, 00:14:40.010 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:40.010 "assigned_rate_limits": { 00:14:40.010 "rw_ios_per_sec": 0, 00:14:40.010 "rw_mbytes_per_sec": 0, 00:14:40.010 "r_mbytes_per_sec": 0, 00:14:40.010 "w_mbytes_per_sec": 0 00:14:40.010 }, 00:14:40.010 "claimed": false, 00:14:40.010 "zoned": false, 00:14:40.010 "supported_io_types": { 00:14:40.010 "read": true, 00:14:40.010 "write": true, 00:14:40.010 "unmap": true, 00:14:40.010 "write_zeroes": true, 00:14:40.010 "flush": true, 00:14:40.010 "reset": true, 00:14:40.010 "compare": false, 00:14:40.010 "compare_and_write": false, 00:14:40.010 "abort": true, 00:14:40.010 "nvme_admin": false, 00:14:40.010 "nvme_io": false 00:14:40.010 }, 00:14:40.010 "memory_domains": [ 00:14:40.010 { 00:14:40.010 "dma_device_id": "system", 00:14:40.010 "dma_device_type": 1 00:14:40.010 }, 00:14:40.010 { 00:14:40.010 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.010 "dma_device_type": 2 00:14:40.010 } 00:14:40.010 ], 00:14:40.010 "driver_specific": {} 00:14:40.010 } 00:14:40.010 ] 00:14:40.010 04:15:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:40.010 04:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:40.010 04:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:40.010 04:15:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:14:40.268 BaseBdev4 00:14:40.268 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # 
waitforbdev BaseBdev4 00:14:40.268 04:15:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:14:40.268 04:15:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:40.268 04:15:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:40.268 04:15:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:40.268 04:15:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:40.268 04:15:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:40.527 04:15:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:14:40.785 [ 00:14:40.785 { 00:14:40.785 "name": "BaseBdev4", 00:14:40.785 "aliases": [ 00:14:40.785 "ba53a54c-a984-4ac4-a928-9c825c9657a3" 00:14:40.785 ], 00:14:40.785 "product_name": "Malloc disk", 00:14:40.785 "block_size": 512, 00:14:40.785 "num_blocks": 65536, 00:14:40.785 "uuid": "ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:40.785 "assigned_rate_limits": { 00:14:40.785 "rw_ios_per_sec": 0, 00:14:40.785 "rw_mbytes_per_sec": 0, 00:14:40.785 "r_mbytes_per_sec": 0, 00:14:40.785 "w_mbytes_per_sec": 0 00:14:40.785 }, 00:14:40.785 "claimed": false, 00:14:40.785 "zoned": false, 00:14:40.785 "supported_io_types": { 00:14:40.785 "read": true, 00:14:40.785 "write": true, 00:14:40.785 "unmap": true, 00:14:40.785 "write_zeroes": true, 00:14:40.785 "flush": true, 00:14:40.785 "reset": true, 00:14:40.785 "compare": false, 00:14:40.785 "compare_and_write": false, 00:14:40.785 "abort": true, 00:14:40.785 "nvme_admin": false, 00:14:40.785 "nvme_io": false 00:14:40.785 }, 00:14:40.785 "memory_domains": [ 00:14:40.785 { 00:14:40.785 "dma_device_id": "system", 00:14:40.785 "dma_device_type": 1 00:14:40.785 }, 00:14:40.785 { 00:14:40.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:40.785 "dma_device_type": 2 00:14:40.785 } 00:14:40.785 ], 00:14:40.785 "driver_specific": {} 00:14:40.785 } 00:14:40.785 ] 00:14:40.785 04:15:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:40.785 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:14:40.785 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:14:40.785 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:14:41.043 [2024-05-15 04:15:28.977503] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:41.043 [2024-05-15 04:15:28.977547] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:41.043 [2024-05-15 04:15:28.977575] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:41.043 [2024-05-15 04:15:28.979015] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:41.043 [2024-05-15 04:15:28.979062] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev4 is claimed 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.043 04:15:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:41.301 04:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:41.301 "name": "Existed_Raid", 00:14:41.301 "uuid": "5c864038-7cee-4c29-8089-28a95ab7aefb", 00:14:41.301 "strip_size_kb": 64, 00:14:41.301 "state": "configuring", 00:14:41.301 "raid_level": "raid0", 00:14:41.301 "superblock": true, 00:14:41.301 "num_base_bdevs": 4, 00:14:41.301 "num_base_bdevs_discovered": 3, 00:14:41.301 "num_base_bdevs_operational": 4, 00:14:41.301 "base_bdevs_list": [ 00:14:41.301 { 00:14:41.301 "name": "BaseBdev1", 00:14:41.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.301 "is_configured": false, 00:14:41.301 "data_offset": 0, 00:14:41.301 "data_size": 0 00:14:41.301 }, 00:14:41.301 { 00:14:41.301 "name": "BaseBdev2", 00:14:41.301 "uuid": "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:41.301 "is_configured": true, 00:14:41.301 "data_offset": 2048, 00:14:41.301 "data_size": 63488 00:14:41.301 }, 00:14:41.301 { 00:14:41.302 "name": "BaseBdev3", 00:14:41.302 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:41.302 "is_configured": true, 00:14:41.302 "data_offset": 2048, 00:14:41.302 "data_size": 63488 00:14:41.302 }, 00:14:41.302 { 00:14:41.302 "name": "BaseBdev4", 00:14:41.302 "uuid": "ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:41.302 "is_configured": true, 00:14:41.302 "data_offset": 2048, 00:14:41.302 "data_size": 63488 00:14:41.302 } 00:14:41.302 ] 00:14:41.302 }' 00:14:41.302 04:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:41.302 04:15:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:41.867 04:15:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:42.126 [2024-05-15 04:15:30.024234] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:42.126 04:15:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.126 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.384 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:42.384 "name": "Existed_Raid", 00:14:42.384 "uuid": "5c864038-7cee-4c29-8089-28a95ab7aefb", 00:14:42.384 "strip_size_kb": 64, 00:14:42.384 "state": "configuring", 00:14:42.384 "raid_level": "raid0", 00:14:42.384 "superblock": true, 00:14:42.384 "num_base_bdevs": 4, 00:14:42.384 "num_base_bdevs_discovered": 2, 00:14:42.384 "num_base_bdevs_operational": 4, 00:14:42.384 "base_bdevs_list": [ 00:14:42.384 { 00:14:42.384 "name": "BaseBdev1", 00:14:42.384 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.384 "is_configured": false, 00:14:42.384 "data_offset": 0, 00:14:42.384 "data_size": 0 00:14:42.384 }, 00:14:42.384 { 00:14:42.384 "name": null, 00:14:42.384 "uuid": "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:42.384 "is_configured": false, 00:14:42.384 "data_offset": 2048, 00:14:42.384 "data_size": 63488 00:14:42.384 }, 00:14:42.384 { 00:14:42.384 "name": "BaseBdev3", 00:14:42.384 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:42.384 "is_configured": true, 00:14:42.384 "data_offset": 2048, 00:14:42.384 "data_size": 63488 00:14:42.384 }, 00:14:42.384 { 00:14:42.384 "name": "BaseBdev4", 00:14:42.384 "uuid": "ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:42.384 "is_configured": true, 00:14:42.384 "data_offset": 2048, 00:14:42.384 "data_size": 63488 00:14:42.384 } 00:14:42.384 ] 00:14:42.384 }' 00:14:42.384 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:42.384 04:15:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:42.950 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.950 04:15:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:43.208 04:15:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:14:43.208 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:43.466 [2024-05-15 04:15:31.341819] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:43.466 BaseBdev1 00:14:43.466 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:14:43.466 04:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:14:43.466 04:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:43.466 04:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:43.466 04:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:43.466 04:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:43.466 04:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:43.724 04:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:43.982 [ 00:14:43.982 { 00:14:43.982 "name": "BaseBdev1", 00:14:43.982 "aliases": [ 00:14:43.983 "f25d5b92-593e-4820-8eeb-beea1def140e" 00:14:43.983 ], 00:14:43.983 "product_name": "Malloc disk", 00:14:43.983 "block_size": 512, 00:14:43.983 "num_blocks": 65536, 00:14:43.983 "uuid": "f25d5b92-593e-4820-8eeb-beea1def140e", 00:14:43.983 "assigned_rate_limits": { 00:14:43.983 "rw_ios_per_sec": 0, 00:14:43.983 "rw_mbytes_per_sec": 0, 00:14:43.983 "r_mbytes_per_sec": 0, 00:14:43.983 "w_mbytes_per_sec": 0 00:14:43.983 }, 00:14:43.983 "claimed": true, 00:14:43.983 "claim_type": "exclusive_write", 00:14:43.983 "zoned": false, 00:14:43.983 "supported_io_types": { 00:14:43.983 "read": true, 00:14:43.983 "write": true, 00:14:43.983 "unmap": true, 00:14:43.983 "write_zeroes": true, 00:14:43.983 "flush": true, 00:14:43.983 "reset": true, 00:14:43.983 "compare": false, 00:14:43.983 "compare_and_write": false, 00:14:43.983 "abort": true, 00:14:43.983 "nvme_admin": false, 00:14:43.983 "nvme_io": false 00:14:43.983 }, 00:14:43.983 "memory_domains": [ 00:14:43.983 { 00:14:43.983 "dma_device_id": "system", 00:14:43.983 "dma_device_type": 1 00:14:43.983 }, 00:14:43.983 { 00:14:43.983 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.983 "dma_device_type": 2 00:14:43.983 } 00:14:43.983 ], 00:14:43.983 "driver_specific": {} 00:14:43.983 } 00:14:43.983 ] 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:43.983 04:15:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.983 04:15:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:44.241 04:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:44.241 "name": "Existed_Raid", 00:14:44.241 "uuid": "5c864038-7cee-4c29-8089-28a95ab7aefb", 00:14:44.241 "strip_size_kb": 64, 00:14:44.241 "state": "configuring", 00:14:44.241 "raid_level": "raid0", 00:14:44.241 "superblock": true, 00:14:44.241 "num_base_bdevs": 4, 00:14:44.241 "num_base_bdevs_discovered": 3, 00:14:44.241 "num_base_bdevs_operational": 4, 00:14:44.241 "base_bdevs_list": [ 00:14:44.241 { 00:14:44.241 "name": "BaseBdev1", 00:14:44.241 "uuid": "f25d5b92-593e-4820-8eeb-beea1def140e", 00:14:44.241 "is_configured": true, 00:14:44.241 "data_offset": 2048, 00:14:44.241 "data_size": 63488 00:14:44.241 }, 00:14:44.241 { 00:14:44.241 "name": null, 00:14:44.241 "uuid": "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:44.241 "is_configured": false, 00:14:44.241 "data_offset": 2048, 00:14:44.241 "data_size": 63488 00:14:44.241 }, 00:14:44.241 { 00:14:44.241 "name": "BaseBdev3", 00:14:44.241 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:44.241 "is_configured": true, 00:14:44.241 "data_offset": 2048, 00:14:44.241 "data_size": 63488 00:14:44.241 }, 00:14:44.241 { 00:14:44.241 "name": "BaseBdev4", 00:14:44.241 "uuid": "ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:44.241 "is_configured": true, 00:14:44.241 "data_offset": 2048, 00:14:44.241 "data_size": 63488 00:14:44.241 } 00:14:44.241 ] 00:14:44.241 }' 00:14:44.241 04:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:44.241 04:15:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:44.807 04:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.807 04:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:45.066 04:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:14:45.066 04:15:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:45.324 [2024-05-15 04:15:33.262933] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 4 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.324 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:45.582 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:45.582 "name": "Existed_Raid", 00:14:45.582 "uuid": "5c864038-7cee-4c29-8089-28a95ab7aefb", 00:14:45.582 "strip_size_kb": 64, 00:14:45.582 "state": "configuring", 00:14:45.582 "raid_level": "raid0", 00:14:45.582 "superblock": true, 00:14:45.582 "num_base_bdevs": 4, 00:14:45.582 "num_base_bdevs_discovered": 2, 00:14:45.582 "num_base_bdevs_operational": 4, 00:14:45.582 "base_bdevs_list": [ 00:14:45.582 { 00:14:45.582 "name": "BaseBdev1", 00:14:45.582 "uuid": "f25d5b92-593e-4820-8eeb-beea1def140e", 00:14:45.582 "is_configured": true, 00:14:45.582 "data_offset": 2048, 00:14:45.582 "data_size": 63488 00:14:45.582 }, 00:14:45.582 { 00:14:45.582 "name": null, 00:14:45.582 "uuid": "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:45.582 "is_configured": false, 00:14:45.582 "data_offset": 2048, 00:14:45.582 "data_size": 63488 00:14:45.582 }, 00:14:45.582 { 00:14:45.582 "name": null, 00:14:45.582 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:45.582 "is_configured": false, 00:14:45.582 "data_offset": 2048, 00:14:45.582 "data_size": 63488 00:14:45.582 }, 00:14:45.582 { 00:14:45.582 "name": "BaseBdev4", 00:14:45.582 "uuid": "ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:45.582 "is_configured": true, 00:14:45.582 "data_offset": 2048, 00:14:45.582 "data_size": 63488 00:14:45.582 } 00:14:45.582 ] 00:14:45.582 }' 00:14:45.582 04:15:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:45.582 04:15:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:46.148 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.148 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:46.406 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:14:46.406 04:15:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:46.664 [2024-05-15 04:15:34.594550] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.664 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.922 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:46.922 "name": "Existed_Raid", 00:14:46.922 "uuid": "5c864038-7cee-4c29-8089-28a95ab7aefb", 00:14:46.922 "strip_size_kb": 64, 00:14:46.922 "state": "configuring", 00:14:46.922 "raid_level": "raid0", 00:14:46.922 "superblock": true, 00:14:46.922 "num_base_bdevs": 4, 00:14:46.922 "num_base_bdevs_discovered": 3, 00:14:46.922 "num_base_bdevs_operational": 4, 00:14:46.922 "base_bdevs_list": [ 00:14:46.922 { 00:14:46.922 "name": "BaseBdev1", 00:14:46.922 "uuid": "f25d5b92-593e-4820-8eeb-beea1def140e", 00:14:46.922 "is_configured": true, 00:14:46.922 "data_offset": 2048, 00:14:46.922 "data_size": 63488 00:14:46.922 }, 00:14:46.922 { 00:14:46.922 "name": null, 00:14:46.922 "uuid": "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:46.922 "is_configured": false, 00:14:46.922 "data_offset": 2048, 00:14:46.922 "data_size": 63488 00:14:46.922 }, 00:14:46.922 { 00:14:46.922 "name": "BaseBdev3", 00:14:46.922 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:46.922 "is_configured": true, 00:14:46.922 "data_offset": 2048, 00:14:46.922 "data_size": 63488 00:14:46.922 }, 00:14:46.922 { 00:14:46.922 "name": "BaseBdev4", 00:14:46.922 "uuid": "ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:46.922 "is_configured": true, 00:14:46.922 "data_offset": 2048, 00:14:46.922 "data_size": 63488 00:14:46.922 } 00:14:46.922 ] 00:14:46.922 }' 00:14:46.922 04:15:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:46.922 04:15:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:47.485 04:15:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.485 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:47.742 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:14:47.742 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:47.999 [2024-05-15 04:15:35.886031] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.999 04:15:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:48.258 04:15:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:48.258 "name": "Existed_Raid", 00:14:48.258 "uuid": "5c864038-7cee-4c29-8089-28a95ab7aefb", 00:14:48.258 "strip_size_kb": 64, 00:14:48.258 "state": "configuring", 00:14:48.258 "raid_level": "raid0", 00:14:48.258 "superblock": true, 00:14:48.258 "num_base_bdevs": 4, 00:14:48.258 "num_base_bdevs_discovered": 2, 00:14:48.258 "num_base_bdevs_operational": 4, 00:14:48.258 "base_bdevs_list": [ 00:14:48.258 { 00:14:48.258 "name": null, 00:14:48.258 "uuid": "f25d5b92-593e-4820-8eeb-beea1def140e", 00:14:48.258 "is_configured": false, 00:14:48.258 "data_offset": 2048, 00:14:48.258 "data_size": 63488 00:14:48.258 }, 00:14:48.258 { 00:14:48.258 "name": null, 00:14:48.258 "uuid": "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:48.258 "is_configured": false, 00:14:48.258 "data_offset": 2048, 00:14:48.258 "data_size": 63488 00:14:48.258 }, 00:14:48.258 { 00:14:48.258 "name": "BaseBdev3", 00:14:48.258 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:48.258 "is_configured": true, 00:14:48.258 "data_offset": 2048, 00:14:48.258 "data_size": 63488 00:14:48.258 }, 00:14:48.258 { 00:14:48.258 "name": "BaseBdev4", 00:14:48.258 "uuid": 
"ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:48.258 "is_configured": true, 00:14:48.258 "data_offset": 2048, 00:14:48.258 "data_size": 63488 00:14:48.258 } 00:14:48.258 ] 00:14:48.258 }' 00:14:48.258 04:15:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:48.258 04:15:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:48.823 04:15:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.823 04:15:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:49.081 04:15:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:14:49.081 04:15:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:49.339 [2024-05-15 04:15:37.222296] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.339 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.597 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:49.597 "name": "Existed_Raid", 00:14:49.597 "uuid": "5c864038-7cee-4c29-8089-28a95ab7aefb", 00:14:49.597 "strip_size_kb": 64, 00:14:49.597 "state": "configuring", 00:14:49.597 "raid_level": "raid0", 00:14:49.597 "superblock": true, 00:14:49.597 "num_base_bdevs": 4, 00:14:49.597 "num_base_bdevs_discovered": 3, 00:14:49.597 "num_base_bdevs_operational": 4, 00:14:49.597 "base_bdevs_list": [ 00:14:49.597 { 00:14:49.597 "name": null, 00:14:49.597 "uuid": "f25d5b92-593e-4820-8eeb-beea1def140e", 00:14:49.597 "is_configured": false, 00:14:49.597 "data_offset": 2048, 00:14:49.597 "data_size": 63488 00:14:49.597 }, 00:14:49.597 { 00:14:49.597 "name": "BaseBdev2", 00:14:49.597 "uuid": 
"b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:49.597 "is_configured": true, 00:14:49.597 "data_offset": 2048, 00:14:49.597 "data_size": 63488 00:14:49.597 }, 00:14:49.597 { 00:14:49.597 "name": "BaseBdev3", 00:14:49.597 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:49.597 "is_configured": true, 00:14:49.597 "data_offset": 2048, 00:14:49.597 "data_size": 63488 00:14:49.597 }, 00:14:49.597 { 00:14:49.597 "name": "BaseBdev4", 00:14:49.597 "uuid": "ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:49.597 "is_configured": true, 00:14:49.597 "data_offset": 2048, 00:14:49.597 "data_size": 63488 00:14:49.597 } 00:14:49.597 ] 00:14:49.597 }' 00:14:49.597 04:15:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:49.597 04:15:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:50.164 04:15:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.164 04:15:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:50.421 04:15:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:14:50.421 04:15:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.421 04:15:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:50.679 04:15:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f25d5b92-593e-4820-8eeb-beea1def140e 00:14:50.937 [2024-05-15 04:15:38.876707] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:50.937 [2024-05-15 04:15:38.876958] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1456690 00:14:50.937 [2024-05-15 04:15:38.876974] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:14:50.938 [2024-05-15 04:15:38.877125] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1456970 00:14:50.938 [2024-05-15 04:15:38.877268] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1456690 00:14:50.938 [2024-05-15 04:15:38.877281] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1456690 00:14:50.938 [2024-05-15 04:15:38.877369] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:50.938 NewBaseBdev 00:14:50.938 04:15:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:14:50.938 04:15:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:14:50.938 04:15:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:14:50.938 04:15:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:14:50.938 04:15:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:14:50.938 04:15:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:14:50.938 04:15:38 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:51.195 04:15:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:51.453 [ 00:14:51.453 { 00:14:51.453 "name": "NewBaseBdev", 00:14:51.453 "aliases": [ 00:14:51.453 "f25d5b92-593e-4820-8eeb-beea1def140e" 00:14:51.453 ], 00:14:51.453 "product_name": "Malloc disk", 00:14:51.453 "block_size": 512, 00:14:51.453 "num_blocks": 65536, 00:14:51.453 "uuid": "f25d5b92-593e-4820-8eeb-beea1def140e", 00:14:51.453 "assigned_rate_limits": { 00:14:51.453 "rw_ios_per_sec": 0, 00:14:51.453 "rw_mbytes_per_sec": 0, 00:14:51.453 "r_mbytes_per_sec": 0, 00:14:51.453 "w_mbytes_per_sec": 0 00:14:51.453 }, 00:14:51.453 "claimed": true, 00:14:51.453 "claim_type": "exclusive_write", 00:14:51.453 "zoned": false, 00:14:51.454 "supported_io_types": { 00:14:51.454 "read": true, 00:14:51.454 "write": true, 00:14:51.454 "unmap": true, 00:14:51.454 "write_zeroes": true, 00:14:51.454 "flush": true, 00:14:51.454 "reset": true, 00:14:51.454 "compare": false, 00:14:51.454 "compare_and_write": false, 00:14:51.454 "abort": true, 00:14:51.454 "nvme_admin": false, 00:14:51.454 "nvme_io": false 00:14:51.454 }, 00:14:51.454 "memory_domains": [ 00:14:51.454 { 00:14:51.454 "dma_device_id": "system", 00:14:51.454 "dma_device_type": 1 00:14:51.454 }, 00:14:51.454 { 00:14:51.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.454 "dma_device_type": 2 00:14:51.454 } 00:14:51.454 ], 00:14:51.454 "driver_specific": {} 00:14:51.454 } 00:14:51.454 ] 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.454 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:51.712 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:51.712 "name": "Existed_Raid", 
00:14:51.712 "uuid": "5c864038-7cee-4c29-8089-28a95ab7aefb", 00:14:51.712 "strip_size_kb": 64, 00:14:51.712 "state": "online", 00:14:51.712 "raid_level": "raid0", 00:14:51.712 "superblock": true, 00:14:51.712 "num_base_bdevs": 4, 00:14:51.712 "num_base_bdevs_discovered": 4, 00:14:51.712 "num_base_bdevs_operational": 4, 00:14:51.712 "base_bdevs_list": [ 00:14:51.712 { 00:14:51.712 "name": "NewBaseBdev", 00:14:51.712 "uuid": "f25d5b92-593e-4820-8eeb-beea1def140e", 00:14:51.712 "is_configured": true, 00:14:51.712 "data_offset": 2048, 00:14:51.712 "data_size": 63488 00:14:51.712 }, 00:14:51.712 { 00:14:51.712 "name": "BaseBdev2", 00:14:51.712 "uuid": "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:51.712 "is_configured": true, 00:14:51.712 "data_offset": 2048, 00:14:51.712 "data_size": 63488 00:14:51.712 }, 00:14:51.712 { 00:14:51.712 "name": "BaseBdev3", 00:14:51.712 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:51.712 "is_configured": true, 00:14:51.712 "data_offset": 2048, 00:14:51.712 "data_size": 63488 00:14:51.712 }, 00:14:51.712 { 00:14:51.712 "name": "BaseBdev4", 00:14:51.712 "uuid": "ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:51.712 "is_configured": true, 00:14:51.712 "data_offset": 2048, 00:14:51.712 "data_size": 63488 00:14:51.712 } 00:14:51.712 ] 00:14:51.712 }' 00:14:51.712 04:15:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:51.712 04:15:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:52.278 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:14:52.278 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:14:52.278 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:52.278 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:52.278 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:52.278 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:14:52.278 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:52.278 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:52.584 [2024-05-15 04:15:40.465219] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:52.584 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:52.584 "name": "Existed_Raid", 00:14:52.584 "aliases": [ 00:14:52.584 "5c864038-7cee-4c29-8089-28a95ab7aefb" 00:14:52.584 ], 00:14:52.584 "product_name": "Raid Volume", 00:14:52.584 "block_size": 512, 00:14:52.584 "num_blocks": 253952, 00:14:52.584 "uuid": "5c864038-7cee-4c29-8089-28a95ab7aefb", 00:14:52.584 "assigned_rate_limits": { 00:14:52.584 "rw_ios_per_sec": 0, 00:14:52.584 "rw_mbytes_per_sec": 0, 00:14:52.584 "r_mbytes_per_sec": 0, 00:14:52.584 "w_mbytes_per_sec": 0 00:14:52.584 }, 00:14:52.584 "claimed": false, 00:14:52.584 "zoned": false, 00:14:52.584 "supported_io_types": { 00:14:52.584 "read": true, 00:14:52.584 "write": true, 00:14:52.584 "unmap": true, 00:14:52.584 "write_zeroes": true, 00:14:52.584 "flush": true, 00:14:52.584 "reset": true, 00:14:52.584 "compare": false, 00:14:52.584 
"compare_and_write": false, 00:14:52.584 "abort": false, 00:14:52.584 "nvme_admin": false, 00:14:52.585 "nvme_io": false 00:14:52.585 }, 00:14:52.585 "memory_domains": [ 00:14:52.585 { 00:14:52.585 "dma_device_id": "system", 00:14:52.585 "dma_device_type": 1 00:14:52.585 }, 00:14:52.585 { 00:14:52.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.585 "dma_device_type": 2 00:14:52.585 }, 00:14:52.585 { 00:14:52.585 "dma_device_id": "system", 00:14:52.585 "dma_device_type": 1 00:14:52.585 }, 00:14:52.585 { 00:14:52.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.585 "dma_device_type": 2 00:14:52.585 }, 00:14:52.585 { 00:14:52.585 "dma_device_id": "system", 00:14:52.585 "dma_device_type": 1 00:14:52.585 }, 00:14:52.585 { 00:14:52.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.585 "dma_device_type": 2 00:14:52.585 }, 00:14:52.585 { 00:14:52.585 "dma_device_id": "system", 00:14:52.585 "dma_device_type": 1 00:14:52.585 }, 00:14:52.585 { 00:14:52.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.585 "dma_device_type": 2 00:14:52.585 } 00:14:52.585 ], 00:14:52.585 "driver_specific": { 00:14:52.585 "raid": { 00:14:52.585 "uuid": "5c864038-7cee-4c29-8089-28a95ab7aefb", 00:14:52.585 "strip_size_kb": 64, 00:14:52.585 "state": "online", 00:14:52.585 "raid_level": "raid0", 00:14:52.585 "superblock": true, 00:14:52.585 "num_base_bdevs": 4, 00:14:52.585 "num_base_bdevs_discovered": 4, 00:14:52.585 "num_base_bdevs_operational": 4, 00:14:52.585 "base_bdevs_list": [ 00:14:52.585 { 00:14:52.585 "name": "NewBaseBdev", 00:14:52.585 "uuid": "f25d5b92-593e-4820-8eeb-beea1def140e", 00:14:52.585 "is_configured": true, 00:14:52.585 "data_offset": 2048, 00:14:52.585 "data_size": 63488 00:14:52.585 }, 00:14:52.585 { 00:14:52.585 "name": "BaseBdev2", 00:14:52.585 "uuid": "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:52.585 "is_configured": true, 00:14:52.585 "data_offset": 2048, 00:14:52.585 "data_size": 63488 00:14:52.585 }, 00:14:52.585 { 00:14:52.585 "name": "BaseBdev3", 00:14:52.585 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:52.585 "is_configured": true, 00:14:52.585 "data_offset": 2048, 00:14:52.585 "data_size": 63488 00:14:52.585 }, 00:14:52.585 { 00:14:52.585 "name": "BaseBdev4", 00:14:52.585 "uuid": "ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:52.585 "is_configured": true, 00:14:52.585 "data_offset": 2048, 00:14:52.585 "data_size": 63488 00:14:52.585 } 00:14:52.585 ] 00:14:52.585 } 00:14:52.585 } 00:14:52.585 }' 00:14:52.585 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:52.585 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:14:52.585 BaseBdev2 00:14:52.585 BaseBdev3 00:14:52.585 BaseBdev4' 00:14:52.585 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:52.585 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:52.585 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:52.860 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:52.860 "name": "NewBaseBdev", 00:14:52.860 "aliases": [ 00:14:52.860 "f25d5b92-593e-4820-8eeb-beea1def140e" 00:14:52.860 ], 00:14:52.860 "product_name": "Malloc disk", 
00:14:52.860 "block_size": 512, 00:14:52.860 "num_blocks": 65536, 00:14:52.860 "uuid": "f25d5b92-593e-4820-8eeb-beea1def140e", 00:14:52.860 "assigned_rate_limits": { 00:14:52.860 "rw_ios_per_sec": 0, 00:14:52.860 "rw_mbytes_per_sec": 0, 00:14:52.860 "r_mbytes_per_sec": 0, 00:14:52.860 "w_mbytes_per_sec": 0 00:14:52.860 }, 00:14:52.860 "claimed": true, 00:14:52.860 "claim_type": "exclusive_write", 00:14:52.860 "zoned": false, 00:14:52.860 "supported_io_types": { 00:14:52.860 "read": true, 00:14:52.860 "write": true, 00:14:52.860 "unmap": true, 00:14:52.860 "write_zeroes": true, 00:14:52.860 "flush": true, 00:14:52.860 "reset": true, 00:14:52.860 "compare": false, 00:14:52.860 "compare_and_write": false, 00:14:52.860 "abort": true, 00:14:52.860 "nvme_admin": false, 00:14:52.860 "nvme_io": false 00:14:52.860 }, 00:14:52.860 "memory_domains": [ 00:14:52.860 { 00:14:52.860 "dma_device_id": "system", 00:14:52.860 "dma_device_type": 1 00:14:52.860 }, 00:14:52.860 { 00:14:52.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.860 "dma_device_type": 2 00:14:52.860 } 00:14:52.860 ], 00:14:52.860 "driver_specific": {} 00:14:52.860 }' 00:14:52.860 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:52.860 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:52.860 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:52.860 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:53.118 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:53.118 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:53.118 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:53.118 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:53.118 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:53.118 04:15:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:53.118 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:53.118 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:53.118 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:53.118 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:53.118 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:53.376 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:53.376 "name": "BaseBdev2", 00:14:53.376 "aliases": [ 00:14:53.376 "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae" 00:14:53.376 ], 00:14:53.376 "product_name": "Malloc disk", 00:14:53.376 "block_size": 512, 00:14:53.376 "num_blocks": 65536, 00:14:53.376 "uuid": "b6ca27bd-6fe3-4e69-a3c0-41ad57aed6ae", 00:14:53.376 "assigned_rate_limits": { 00:14:53.376 "rw_ios_per_sec": 0, 00:14:53.376 "rw_mbytes_per_sec": 0, 00:14:53.376 "r_mbytes_per_sec": 0, 00:14:53.376 "w_mbytes_per_sec": 0 00:14:53.376 }, 00:14:53.376 "claimed": true, 00:14:53.376 "claim_type": "exclusive_write", 00:14:53.376 "zoned": false, 
00:14:53.376 "supported_io_types": { 00:14:53.376 "read": true, 00:14:53.376 "write": true, 00:14:53.376 "unmap": true, 00:14:53.376 "write_zeroes": true, 00:14:53.376 "flush": true, 00:14:53.376 "reset": true, 00:14:53.376 "compare": false, 00:14:53.376 "compare_and_write": false, 00:14:53.376 "abort": true, 00:14:53.376 "nvme_admin": false, 00:14:53.376 "nvme_io": false 00:14:53.376 }, 00:14:53.376 "memory_domains": [ 00:14:53.376 { 00:14:53.376 "dma_device_id": "system", 00:14:53.376 "dma_device_type": 1 00:14:53.376 }, 00:14:53.376 { 00:14:53.376 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:53.376 "dma_device_type": 2 00:14:53.376 } 00:14:53.376 ], 00:14:53.376 "driver_specific": {} 00:14:53.376 }' 00:14:53.376 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:53.376 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:53.376 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:53.376 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:53.634 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:53.892 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:53.892 "name": "BaseBdev3", 00:14:53.892 "aliases": [ 00:14:53.892 "d27965af-0e90-41e5-8237-3bfe64be15e1" 00:14:53.892 ], 00:14:53.892 "product_name": "Malloc disk", 00:14:53.892 "block_size": 512, 00:14:53.892 "num_blocks": 65536, 00:14:53.892 "uuid": "d27965af-0e90-41e5-8237-3bfe64be15e1", 00:14:53.892 "assigned_rate_limits": { 00:14:53.892 "rw_ios_per_sec": 0, 00:14:53.892 "rw_mbytes_per_sec": 0, 00:14:53.892 "r_mbytes_per_sec": 0, 00:14:53.892 "w_mbytes_per_sec": 0 00:14:53.892 }, 00:14:53.892 "claimed": true, 00:14:53.892 "claim_type": "exclusive_write", 00:14:53.892 "zoned": false, 00:14:53.892 "supported_io_types": { 00:14:53.892 "read": true, 00:14:53.892 "write": true, 00:14:53.892 "unmap": true, 00:14:53.892 "write_zeroes": true, 00:14:53.892 "flush": true, 00:14:53.892 "reset": true, 00:14:53.892 "compare": false, 00:14:53.892 "compare_and_write": false, 00:14:53.892 "abort": true, 00:14:53.892 "nvme_admin": false, 00:14:53.892 "nvme_io": false 00:14:53.892 }, 00:14:53.892 "memory_domains": [ 
00:14:53.892 { 00:14:53.892 "dma_device_id": "system", 00:14:53.892 "dma_device_type": 1 00:14:53.892 }, 00:14:53.892 { 00:14:53.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:53.892 "dma_device_type": 2 00:14:53.892 } 00:14:53.892 ], 00:14:53.892 "driver_specific": {} 00:14:53.892 }' 00:14:53.892 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:53.892 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:53.892 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:53.892 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:54.150 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:54.150 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:54.150 04:15:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:54.150 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:54.150 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:54.150 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:54.150 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:54.150 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:54.150 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:54.150 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:14:54.150 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:14:54.408 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:14:54.408 "name": "BaseBdev4", 00:14:54.408 "aliases": [ 00:14:54.408 "ba53a54c-a984-4ac4-a928-9c825c9657a3" 00:14:54.408 ], 00:14:54.408 "product_name": "Malloc disk", 00:14:54.408 "block_size": 512, 00:14:54.408 "num_blocks": 65536, 00:14:54.408 "uuid": "ba53a54c-a984-4ac4-a928-9c825c9657a3", 00:14:54.408 "assigned_rate_limits": { 00:14:54.408 "rw_ios_per_sec": 0, 00:14:54.408 "rw_mbytes_per_sec": 0, 00:14:54.408 "r_mbytes_per_sec": 0, 00:14:54.408 "w_mbytes_per_sec": 0 00:14:54.408 }, 00:14:54.408 "claimed": true, 00:14:54.408 "claim_type": "exclusive_write", 00:14:54.408 "zoned": false, 00:14:54.408 "supported_io_types": { 00:14:54.408 "read": true, 00:14:54.408 "write": true, 00:14:54.408 "unmap": true, 00:14:54.408 "write_zeroes": true, 00:14:54.408 "flush": true, 00:14:54.408 "reset": true, 00:14:54.408 "compare": false, 00:14:54.408 "compare_and_write": false, 00:14:54.408 "abort": true, 00:14:54.408 "nvme_admin": false, 00:14:54.408 "nvme_io": false 00:14:54.408 }, 00:14:54.408 "memory_domains": [ 00:14:54.408 { 00:14:54.408 "dma_device_id": "system", 00:14:54.408 "dma_device_type": 1 00:14:54.408 }, 00:14:54.408 { 00:14:54.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.408 "dma_device_type": 2 00:14:54.408 } 00:14:54.408 ], 00:14:54.408 "driver_specific": {} 00:14:54.408 }' 00:14:54.408 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:54.408 04:15:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:14:54.666 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:54.924 [2024-05-15 04:15:42.875358] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:54.924 [2024-05-15 04:15:42.875385] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:54.924 [2024-05-15 04:15:42.875461] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:54.924 [2024-05-15 04:15:42.875527] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:54.924 [2024-05-15 04:15:42.875542] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1456690 name Existed_Raid, state offline 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 3874520 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3874520 ']' 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 3874520 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3874520 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3874520' 00:14:54.924 killing process with pid 3874520 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 3874520 00:14:54.924 [2024-05-15 04:15:42.917010] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:54.924 04:15:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 3874520 00:14:55.182 [2024-05-15 04:15:42.965178] 
bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:55.440 04:15:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:14:55.440 00:14:55.440 real 0m31.894s 00:14:55.440 user 0m59.908s 00:14:55.440 sys 0m4.311s 00:14:55.440 04:15:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:14:55.440 04:15:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:55.440 ************************************ 00:14:55.440 END TEST raid_state_function_test_sb 00:14:55.440 ************************************ 00:14:55.440 04:15:43 bdev_raid -- bdev/bdev_raid.sh@805 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:14:55.440 04:15:43 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:14:55.440 04:15:43 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:14:55.440 04:15:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:55.440 ************************************ 00:14:55.440 START TEST raid_superblock_test 00:14:55.440 ************************************ 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid0 4 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid0 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=4 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid0 '!=' raid1 ']' 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=3878924 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 3878924 /var/tmp/spdk-raid.sock 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 3878924 ']' 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:55.440 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:14:55.440 04:15:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.440 [2024-05-15 04:15:43.362276] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:14:55.440 [2024-05-15 04:15:43.362359] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3878924 ] 00:14:55.440 [2024-05-15 04:15:43.444416] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:55.698 [2024-05-15 04:15:43.565607] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:14:55.698 [2024-05-15 04:15:43.643289] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:55.698 [2024-05-15 04:15:43.643330] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:56.631 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:56.631 malloc1 00:14:56.889 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:57.147 [2024-05-15 04:15:44.917950] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:57.147 [2024-05-15 04:15:44.918013] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:57.147 [2024-05-15 04:15:44.918039] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa6c20 00:14:57.147 [2024-05-15 04:15:44.918055] vbdev_passthru.c: 
691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:57.147 [2024-05-15 04:15:44.919587] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:57.147 [2024-05-15 04:15:44.919616] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:57.147 pt1 00:14:57.147 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:14:57.147 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:57.147 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:14:57.147 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:14:57.147 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:57.147 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:57.147 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:14:57.147 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:57.147 04:15:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:57.405 malloc2 00:14:57.405 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:57.664 [2024-05-15 04:15:45.422489] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:57.664 [2024-05-15 04:15:45.422555] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:57.664 [2024-05-15 04:15:45.422580] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f9ec00 00:14:57.664 [2024-05-15 04:15:45.422596] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:57.664 [2024-05-15 04:15:45.424391] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:57.664 [2024-05-15 04:15:45.424420] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:57.664 pt2 00:14:57.664 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:14:57.664 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:57.664 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:14:57.664 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:14:57.664 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:57.664 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:57.664 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:14:57.664 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:57.664 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:57.664 malloc3 
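The xtrace entries above (bdev_raid.sh@416 through @426) repeat one pattern per base device: create a malloc bdev over the /var/tmp/spdk-raid.sock RPC socket, then wrap it in a passthru bdev with a fixed UUID so the test can address it as ptN. A condensed sketch of that per-device setup, using only the RPC calls visible in this log (the rpc shell helper is shorthand added here for readability, not part of the test script):

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  for i in 1 2 3 4; do
      rpc bdev_malloc_create 32 512 -b "malloc$i"        # 32 MB backing bdev, 512 B blocks (bdev_raid.sh@425)
      rpc bdev_passthru_create -b "malloc$i" -p "pt$i" \
          -u "00000000-0000-0000-0000-00000000000$i"     # wrap malloc$i in passthru bdev pt$i (bdev_raid.sh@426)
  done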
00:14:57.922 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:57.922 [2024-05-15 04:15:45.914380] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:57.922 [2024-05-15 04:15:45.914447] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:57.922 [2024-05-15 04:15:45.914475] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x214f9c0 00:14:57.922 [2024-05-15 04:15:45.914491] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:57.922 [2024-05-15 04:15:45.915934] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:57.922 [2024-05-15 04:15:45.915962] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:57.922 pt3 00:14:57.922 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:14:57.922 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:57.922 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc4 00:14:57.922 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt4 00:14:57.922 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:14:57.922 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:57.922 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:14:57.922 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:57.922 04:15:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:14:58.180 malloc4 00:14:58.180 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:14:58.438 [2024-05-15 04:15:46.402609] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:14:58.438 [2024-05-15 04:15:46.402680] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:58.438 [2024-05-15 04:15:46.402705] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa28e0 00:14:58.438 [2024-05-15 04:15:46.402720] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:58.438 [2024-05-15 04:15:46.404112] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:58.438 [2024-05-15 04:15:46.404141] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:14:58.438 pt4 00:14:58.438 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:14:58.438 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:14:58.438 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 
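The bdev_raid_create call that ends the line above assembles pt1-pt4 into a raid0 volume with a 64 KiB strip size and an on-disk superblock (-s); the entries that follow fetch the volume back and assert on its fields with jq. A condensed create-and-verify sketch reusing the socket path and names from this run (illustrative only; the actual assertions live in verify_raid_bdev_state and verify_raid_bdev_properties in bdev_raid.sh):

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  # assemble the four passthru bdevs into a raid0 bdev with a superblock
  rpc bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
  # pull it back and check the fields the test asserts on
  info=$(rpc bdev_raid_get_bdevs all | jq '.[] | select(.name == "raid_bdev1")')
  [[ $(jq -r .state <<< "$info") == online ]]
  [[ $(jq -r .raid_level <<< "$info") == raid0 ]]
  [[ $(jq -r .num_base_bdevs_discovered <<< "$info") == 4 ]]
  # per-base-bdev properties are checked the same way
  rpc bdev_get_bdevs -b pt1 | jq '.[].block_size'        # expected: 512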
00:14:58.695 [2024-05-15 04:15:46.643264] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:58.695 [2024-05-15 04:15:46.644419] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:58.695 [2024-05-15 04:15:46.644487] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:58.695 [2024-05-15 04:15:46.644545] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:14:58.695 [2024-05-15 04:15:46.644744] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fa1cc0 00:14:58.695 [2024-05-15 04:15:46.644770] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:14:58.695 [2024-05-15 04:15:46.644963] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f9f190 00:14:58.695 [2024-05-15 04:15:46.645123] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fa1cc0 00:14:58.695 [2024-05-15 04:15:46.645139] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fa1cc0 00:14:58.695 [2024-05-15 04:15:46.645241] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:58.695 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:14:58.695 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:14:58.695 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:14:58.695 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:14:58.695 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:14:58.696 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:14:58.696 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:14:58.696 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:14:58.696 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:14:58.696 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:14:58.696 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.696 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:58.954 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:14:58.954 "name": "raid_bdev1", 00:14:58.954 "uuid": "7afa61fe-137b-4884-803b-8d11da2e1a81", 00:14:58.954 "strip_size_kb": 64, 00:14:58.954 "state": "online", 00:14:58.954 "raid_level": "raid0", 00:14:58.954 "superblock": true, 00:14:58.954 "num_base_bdevs": 4, 00:14:58.954 "num_base_bdevs_discovered": 4, 00:14:58.954 "num_base_bdevs_operational": 4, 00:14:58.954 "base_bdevs_list": [ 00:14:58.954 { 00:14:58.954 "name": "pt1", 00:14:58.954 "uuid": "84f5011e-77ab-50d3-9457-7415d2d8b070", 00:14:58.954 "is_configured": true, 00:14:58.954 "data_offset": 2048, 00:14:58.954 "data_size": 63488 00:14:58.954 }, 00:14:58.954 { 00:14:58.954 "name": "pt2", 00:14:58.954 "uuid": "0b3133d8-fe8e-5d3f-857f-7dd468d359b4", 00:14:58.954 "is_configured": true, 00:14:58.954 "data_offset": 2048, 
00:14:58.954 "data_size": 63488 00:14:58.954 }, 00:14:58.954 { 00:14:58.954 "name": "pt3", 00:14:58.954 "uuid": "97a2795c-b9ba-5368-b0e5-34df474b20bc", 00:14:58.954 "is_configured": true, 00:14:58.954 "data_offset": 2048, 00:14:58.954 "data_size": 63488 00:14:58.954 }, 00:14:58.954 { 00:14:58.954 "name": "pt4", 00:14:58.954 "uuid": "0c90c30f-4f2a-5aa7-a23d-70aef1cdb726", 00:14:58.954 "is_configured": true, 00:14:58.954 "data_offset": 2048, 00:14:58.954 "data_size": 63488 00:14:58.954 } 00:14:58.954 ] 00:14:58.954 }' 00:14:58.954 04:15:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:14:58.954 04:15:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.521 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:14:59.521 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:14:59.521 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:14:59.521 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:14:59.521 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:14:59.521 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:14:59.521 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:59.521 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:14:59.779 [2024-05-15 04:15:47.670222] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:59.779 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:14:59.779 "name": "raid_bdev1", 00:14:59.779 "aliases": [ 00:14:59.779 "7afa61fe-137b-4884-803b-8d11da2e1a81" 00:14:59.779 ], 00:14:59.779 "product_name": "Raid Volume", 00:14:59.779 "block_size": 512, 00:14:59.779 "num_blocks": 253952, 00:14:59.779 "uuid": "7afa61fe-137b-4884-803b-8d11da2e1a81", 00:14:59.779 "assigned_rate_limits": { 00:14:59.779 "rw_ios_per_sec": 0, 00:14:59.779 "rw_mbytes_per_sec": 0, 00:14:59.779 "r_mbytes_per_sec": 0, 00:14:59.779 "w_mbytes_per_sec": 0 00:14:59.779 }, 00:14:59.779 "claimed": false, 00:14:59.779 "zoned": false, 00:14:59.779 "supported_io_types": { 00:14:59.779 "read": true, 00:14:59.779 "write": true, 00:14:59.779 "unmap": true, 00:14:59.779 "write_zeroes": true, 00:14:59.779 "flush": true, 00:14:59.779 "reset": true, 00:14:59.779 "compare": false, 00:14:59.779 "compare_and_write": false, 00:14:59.779 "abort": false, 00:14:59.779 "nvme_admin": false, 00:14:59.779 "nvme_io": false 00:14:59.779 }, 00:14:59.779 "memory_domains": [ 00:14:59.779 { 00:14:59.779 "dma_device_id": "system", 00:14:59.779 "dma_device_type": 1 00:14:59.779 }, 00:14:59.779 { 00:14:59.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.779 "dma_device_type": 2 00:14:59.779 }, 00:14:59.779 { 00:14:59.779 "dma_device_id": "system", 00:14:59.779 "dma_device_type": 1 00:14:59.779 }, 00:14:59.779 { 00:14:59.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.779 "dma_device_type": 2 00:14:59.779 }, 00:14:59.779 { 00:14:59.779 "dma_device_id": "system", 00:14:59.779 "dma_device_type": 1 00:14:59.779 }, 00:14:59.779 { 00:14:59.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.779 "dma_device_type": 2 00:14:59.779 }, 00:14:59.779 { 
00:14:59.779 "dma_device_id": "system", 00:14:59.779 "dma_device_type": 1 00:14:59.779 }, 00:14:59.779 { 00:14:59.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.779 "dma_device_type": 2 00:14:59.779 } 00:14:59.779 ], 00:14:59.779 "driver_specific": { 00:14:59.779 "raid": { 00:14:59.779 "uuid": "7afa61fe-137b-4884-803b-8d11da2e1a81", 00:14:59.779 "strip_size_kb": 64, 00:14:59.779 "state": "online", 00:14:59.779 "raid_level": "raid0", 00:14:59.779 "superblock": true, 00:14:59.779 "num_base_bdevs": 4, 00:14:59.779 "num_base_bdevs_discovered": 4, 00:14:59.779 "num_base_bdevs_operational": 4, 00:14:59.779 "base_bdevs_list": [ 00:14:59.779 { 00:14:59.779 "name": "pt1", 00:14:59.779 "uuid": "84f5011e-77ab-50d3-9457-7415d2d8b070", 00:14:59.779 "is_configured": true, 00:14:59.779 "data_offset": 2048, 00:14:59.779 "data_size": 63488 00:14:59.779 }, 00:14:59.779 { 00:14:59.779 "name": "pt2", 00:14:59.779 "uuid": "0b3133d8-fe8e-5d3f-857f-7dd468d359b4", 00:14:59.779 "is_configured": true, 00:14:59.779 "data_offset": 2048, 00:14:59.779 "data_size": 63488 00:14:59.779 }, 00:14:59.779 { 00:14:59.779 "name": "pt3", 00:14:59.779 "uuid": "97a2795c-b9ba-5368-b0e5-34df474b20bc", 00:14:59.779 "is_configured": true, 00:14:59.779 "data_offset": 2048, 00:14:59.779 "data_size": 63488 00:14:59.779 }, 00:14:59.779 { 00:14:59.779 "name": "pt4", 00:14:59.779 "uuid": "0c90c30f-4f2a-5aa7-a23d-70aef1cdb726", 00:14:59.779 "is_configured": true, 00:14:59.779 "data_offset": 2048, 00:14:59.779 "data_size": 63488 00:14:59.779 } 00:14:59.779 ] 00:14:59.779 } 00:14:59.779 } 00:14:59.779 }' 00:14:59.779 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:59.779 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:14:59.779 pt2 00:14:59.779 pt3 00:14:59.779 pt4' 00:14:59.779 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:14:59.779 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:59.779 04:15:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:00.037 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:00.037 "name": "pt1", 00:15:00.037 "aliases": [ 00:15:00.037 "84f5011e-77ab-50d3-9457-7415d2d8b070" 00:15:00.037 ], 00:15:00.037 "product_name": "passthru", 00:15:00.037 "block_size": 512, 00:15:00.037 "num_blocks": 65536, 00:15:00.037 "uuid": "84f5011e-77ab-50d3-9457-7415d2d8b070", 00:15:00.037 "assigned_rate_limits": { 00:15:00.037 "rw_ios_per_sec": 0, 00:15:00.037 "rw_mbytes_per_sec": 0, 00:15:00.037 "r_mbytes_per_sec": 0, 00:15:00.037 "w_mbytes_per_sec": 0 00:15:00.037 }, 00:15:00.037 "claimed": true, 00:15:00.037 "claim_type": "exclusive_write", 00:15:00.037 "zoned": false, 00:15:00.037 "supported_io_types": { 00:15:00.037 "read": true, 00:15:00.037 "write": true, 00:15:00.037 "unmap": true, 00:15:00.037 "write_zeroes": true, 00:15:00.037 "flush": true, 00:15:00.037 "reset": true, 00:15:00.037 "compare": false, 00:15:00.037 "compare_and_write": false, 00:15:00.037 "abort": true, 00:15:00.037 "nvme_admin": false, 00:15:00.037 "nvme_io": false 00:15:00.037 }, 00:15:00.037 "memory_domains": [ 00:15:00.037 { 00:15:00.037 "dma_device_id": "system", 00:15:00.037 "dma_device_type": 1 00:15:00.037 }, 00:15:00.037 { 
00:15:00.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.037 "dma_device_type": 2 00:15:00.037 } 00:15:00.037 ], 00:15:00.037 "driver_specific": { 00:15:00.037 "passthru": { 00:15:00.037 "name": "pt1", 00:15:00.037 "base_bdev_name": "malloc1" 00:15:00.037 } 00:15:00.037 } 00:15:00.037 }' 00:15:00.037 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:00.295 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:00.295 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:00.295 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:00.295 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:00.295 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:00.295 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:00.295 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:00.295 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:00.295 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:00.295 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:00.554 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:00.554 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:00.554 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:00.554 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:00.554 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:00.554 "name": "pt2", 00:15:00.554 "aliases": [ 00:15:00.554 "0b3133d8-fe8e-5d3f-857f-7dd468d359b4" 00:15:00.554 ], 00:15:00.554 "product_name": "passthru", 00:15:00.554 "block_size": 512, 00:15:00.554 "num_blocks": 65536, 00:15:00.554 "uuid": "0b3133d8-fe8e-5d3f-857f-7dd468d359b4", 00:15:00.554 "assigned_rate_limits": { 00:15:00.554 "rw_ios_per_sec": 0, 00:15:00.554 "rw_mbytes_per_sec": 0, 00:15:00.554 "r_mbytes_per_sec": 0, 00:15:00.554 "w_mbytes_per_sec": 0 00:15:00.554 }, 00:15:00.554 "claimed": true, 00:15:00.554 "claim_type": "exclusive_write", 00:15:00.554 "zoned": false, 00:15:00.554 "supported_io_types": { 00:15:00.554 "read": true, 00:15:00.554 "write": true, 00:15:00.554 "unmap": true, 00:15:00.554 "write_zeroes": true, 00:15:00.554 "flush": true, 00:15:00.554 "reset": true, 00:15:00.554 "compare": false, 00:15:00.554 "compare_and_write": false, 00:15:00.554 "abort": true, 00:15:00.554 "nvme_admin": false, 00:15:00.554 "nvme_io": false 00:15:00.554 }, 00:15:00.554 "memory_domains": [ 00:15:00.554 { 00:15:00.554 "dma_device_id": "system", 00:15:00.554 "dma_device_type": 1 00:15:00.554 }, 00:15:00.554 { 00:15:00.554 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.554 "dma_device_type": 2 00:15:00.554 } 00:15:00.554 ], 00:15:00.554 "driver_specific": { 00:15:00.554 "passthru": { 00:15:00.554 "name": "pt2", 00:15:00.554 "base_bdev_name": "malloc2" 00:15:00.554 } 00:15:00.554 } 00:15:00.554 }' 00:15:00.554 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:00.812 04:15:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:00.812 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:00.812 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:00.812 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:00.812 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:00.812 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:00.812 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:00.813 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:00.813 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:00.813 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:01.071 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:01.071 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:01.071 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:01.071 04:15:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:01.071 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:01.071 "name": "pt3", 00:15:01.071 "aliases": [ 00:15:01.071 "97a2795c-b9ba-5368-b0e5-34df474b20bc" 00:15:01.071 ], 00:15:01.071 "product_name": "passthru", 00:15:01.071 "block_size": 512, 00:15:01.071 "num_blocks": 65536, 00:15:01.071 "uuid": "97a2795c-b9ba-5368-b0e5-34df474b20bc", 00:15:01.071 "assigned_rate_limits": { 00:15:01.071 "rw_ios_per_sec": 0, 00:15:01.071 "rw_mbytes_per_sec": 0, 00:15:01.071 "r_mbytes_per_sec": 0, 00:15:01.071 "w_mbytes_per_sec": 0 00:15:01.071 }, 00:15:01.071 "claimed": true, 00:15:01.071 "claim_type": "exclusive_write", 00:15:01.071 "zoned": false, 00:15:01.071 "supported_io_types": { 00:15:01.071 "read": true, 00:15:01.071 "write": true, 00:15:01.071 "unmap": true, 00:15:01.071 "write_zeroes": true, 00:15:01.071 "flush": true, 00:15:01.071 "reset": true, 00:15:01.071 "compare": false, 00:15:01.071 "compare_and_write": false, 00:15:01.071 "abort": true, 00:15:01.071 "nvme_admin": false, 00:15:01.071 "nvme_io": false 00:15:01.071 }, 00:15:01.071 "memory_domains": [ 00:15:01.071 { 00:15:01.071 "dma_device_id": "system", 00:15:01.071 "dma_device_type": 1 00:15:01.071 }, 00:15:01.071 { 00:15:01.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.071 "dma_device_type": 2 00:15:01.071 } 00:15:01.071 ], 00:15:01.071 "driver_specific": { 00:15:01.071 "passthru": { 00:15:01.071 "name": "pt3", 00:15:01.071 "base_bdev_name": "malloc3" 00:15:01.071 } 00:15:01.071 } 00:15:01.071 }' 00:15:01.071 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:01.327 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:01.327 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:01.327 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:01.327 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:01.327 04:15:49 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:01.327 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:01.327 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:01.327 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:01.327 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:01.327 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:01.585 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:01.585 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:01.585 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:15:01.585 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:01.585 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:01.585 "name": "pt4", 00:15:01.585 "aliases": [ 00:15:01.585 "0c90c30f-4f2a-5aa7-a23d-70aef1cdb726" 00:15:01.585 ], 00:15:01.585 "product_name": "passthru", 00:15:01.585 "block_size": 512, 00:15:01.585 "num_blocks": 65536, 00:15:01.585 "uuid": "0c90c30f-4f2a-5aa7-a23d-70aef1cdb726", 00:15:01.585 "assigned_rate_limits": { 00:15:01.585 "rw_ios_per_sec": 0, 00:15:01.585 "rw_mbytes_per_sec": 0, 00:15:01.585 "r_mbytes_per_sec": 0, 00:15:01.586 "w_mbytes_per_sec": 0 00:15:01.586 }, 00:15:01.586 "claimed": true, 00:15:01.586 "claim_type": "exclusive_write", 00:15:01.586 "zoned": false, 00:15:01.586 "supported_io_types": { 00:15:01.586 "read": true, 00:15:01.586 "write": true, 00:15:01.586 "unmap": true, 00:15:01.586 "write_zeroes": true, 00:15:01.586 "flush": true, 00:15:01.586 "reset": true, 00:15:01.586 "compare": false, 00:15:01.586 "compare_and_write": false, 00:15:01.586 "abort": true, 00:15:01.586 "nvme_admin": false, 00:15:01.586 "nvme_io": false 00:15:01.586 }, 00:15:01.586 "memory_domains": [ 00:15:01.586 { 00:15:01.586 "dma_device_id": "system", 00:15:01.586 "dma_device_type": 1 00:15:01.586 }, 00:15:01.586 { 00:15:01.586 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:01.586 "dma_device_type": 2 00:15:01.586 } 00:15:01.586 ], 00:15:01.586 "driver_specific": { 00:15:01.586 "passthru": { 00:15:01.586 "name": "pt4", 00:15:01.586 "base_bdev_name": "malloc4" 00:15:01.586 } 00:15:01.586 } 00:15:01.586 }' 00:15:01.586 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:01.843 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:01.843 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:01.843 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:01.843 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:01.843 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:01.843 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:01.843 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:01.843 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:01.843 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq 
.dif_type 00:15:01.843 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:02.100 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:02.100 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:02.100 04:15:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:15:02.100 [2024-05-15 04:15:50.088621] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:02.101 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=7afa61fe-137b-4884-803b-8d11da2e1a81 00:15:02.101 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 7afa61fe-137b-4884-803b-8d11da2e1a81 ']' 00:15:02.101 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:02.667 [2024-05-15 04:15:50.377152] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:02.667 [2024-05-15 04:15:50.377182] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:02.667 [2024-05-15 04:15:50.377278] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:02.667 [2024-05-15 04:15:50.377346] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:02.667 [2024-05-15 04:15:50.377360] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa1cc0 name raid_bdev1, state offline 00:15:02.667 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.667 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:15:02.667 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:15:02.667 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:15:02.667 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:15:02.667 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:02.925 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:15:02.925 04:15:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:03.184 04:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:15:03.184 04:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:03.442 04:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:15:03.442 04:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:15:03.700 04:15:51 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:03.700 04:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:03.958 04:15:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:15:04.216 [2024-05-15 04:15:52.125765] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:04.216 [2024-05-15 04:15:52.127133] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:04.216 [2024-05-15 04:15:52.127178] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:04.216 [2024-05-15 04:15:52.127230] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:15:04.216 [2024-05-15 04:15:52.127291] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:04.216 [2024-05-15 04:15:52.127342] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:04.216 [2024-05-15 04:15:52.127368] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:04.216 [2024-05-15 04:15:52.127399] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: 
*ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:15:04.216 [2024-05-15 04:15:52.127418] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:04.216 [2024-05-15 04:15:52.127429] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa3aa0 name raid_bdev1, state configuring 00:15:04.216 request: 00:15:04.216 { 00:15:04.216 "name": "raid_bdev1", 00:15:04.216 "raid_level": "raid0", 00:15:04.216 "base_bdevs": [ 00:15:04.216 "malloc1", 00:15:04.216 "malloc2", 00:15:04.216 "malloc3", 00:15:04.216 "malloc4" 00:15:04.216 ], 00:15:04.216 "superblock": false, 00:15:04.216 "strip_size_kb": 64, 00:15:04.217 "method": "bdev_raid_create", 00:15:04.217 "req_id": 1 00:15:04.217 } 00:15:04.217 Got JSON-RPC error response 00:15:04.217 response: 00:15:04.217 { 00:15:04.217 "code": -17, 00:15:04.217 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:04.217 } 00:15:04.217 04:15:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:04.217 04:15:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:15:04.217 04:15:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:04.217 04:15:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:04.217 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.217 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:15:04.475 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:15:04.475 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:15:04.475 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:04.733 [2024-05-15 04:15:52.610986] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:04.733 [2024-05-15 04:15:52.611046] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:04.733 [2024-05-15 04:15:52.611072] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa4a80 00:15:04.733 [2024-05-15 04:15:52.611086] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:04.733 [2024-05-15 04:15:52.612616] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:04.733 [2024-05-15 04:15:52.612640] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:04.733 [2024-05-15 04:15:52.612722] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:04.733 [2024-05-15 04:15:52.612759] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:04.733 pt1 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:04.733 04:15:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.733 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:04.991 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:04.991 "name": "raid_bdev1", 00:15:04.991 "uuid": "7afa61fe-137b-4884-803b-8d11da2e1a81", 00:15:04.991 "strip_size_kb": 64, 00:15:04.991 "state": "configuring", 00:15:04.991 "raid_level": "raid0", 00:15:04.991 "superblock": true, 00:15:04.991 "num_base_bdevs": 4, 00:15:04.991 "num_base_bdevs_discovered": 1, 00:15:04.991 "num_base_bdevs_operational": 4, 00:15:04.991 "base_bdevs_list": [ 00:15:04.991 { 00:15:04.991 "name": "pt1", 00:15:04.991 "uuid": "84f5011e-77ab-50d3-9457-7415d2d8b070", 00:15:04.991 "is_configured": true, 00:15:04.991 "data_offset": 2048, 00:15:04.991 "data_size": 63488 00:15:04.991 }, 00:15:04.991 { 00:15:04.991 "name": null, 00:15:04.991 "uuid": "0b3133d8-fe8e-5d3f-857f-7dd468d359b4", 00:15:04.991 "is_configured": false, 00:15:04.991 "data_offset": 2048, 00:15:04.991 "data_size": 63488 00:15:04.991 }, 00:15:04.991 { 00:15:04.991 "name": null, 00:15:04.991 "uuid": "97a2795c-b9ba-5368-b0e5-34df474b20bc", 00:15:04.991 "is_configured": false, 00:15:04.991 "data_offset": 2048, 00:15:04.991 "data_size": 63488 00:15:04.991 }, 00:15:04.991 { 00:15:04.991 "name": null, 00:15:04.991 "uuid": "0c90c30f-4f2a-5aa7-a23d-70aef1cdb726", 00:15:04.991 "is_configured": false, 00:15:04.991 "data_offset": 2048, 00:15:04.991 "data_size": 63488 00:15:04.991 } 00:15:04.991 ] 00:15:04.991 }' 00:15:04.991 04:15:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:04.991 04:15:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:05.554 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 4 -gt 2 ']' 00:15:05.554 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:05.811 [2024-05-15 04:15:53.629704] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:05.811 [2024-05-15 04:15:53.629769] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:05.811 [2024-05-15 04:15:53.629794] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa0570 00:15:05.811 [2024-05-15 04:15:53.629821] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:05.811 [2024-05-15 04:15:53.630215] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:05.811 [2024-05-15 04:15:53.630236] vbdev_passthru.c: 
705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:05.811 [2024-05-15 04:15:53.630312] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:05.811 [2024-05-15 04:15:53.630336] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:05.811 pt2 00:15:05.811 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:06.068 [2024-05-15 04:15:53.874346] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.068 04:15:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:06.325 04:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:06.325 "name": "raid_bdev1", 00:15:06.325 "uuid": "7afa61fe-137b-4884-803b-8d11da2e1a81", 00:15:06.325 "strip_size_kb": 64, 00:15:06.325 "state": "configuring", 00:15:06.325 "raid_level": "raid0", 00:15:06.325 "superblock": true, 00:15:06.325 "num_base_bdevs": 4, 00:15:06.325 "num_base_bdevs_discovered": 1, 00:15:06.325 "num_base_bdevs_operational": 4, 00:15:06.325 "base_bdevs_list": [ 00:15:06.325 { 00:15:06.325 "name": "pt1", 00:15:06.325 "uuid": "84f5011e-77ab-50d3-9457-7415d2d8b070", 00:15:06.325 "is_configured": true, 00:15:06.325 "data_offset": 2048, 00:15:06.325 "data_size": 63488 00:15:06.325 }, 00:15:06.325 { 00:15:06.325 "name": null, 00:15:06.325 "uuid": "0b3133d8-fe8e-5d3f-857f-7dd468d359b4", 00:15:06.325 "is_configured": false, 00:15:06.325 "data_offset": 2048, 00:15:06.325 "data_size": 63488 00:15:06.325 }, 00:15:06.325 { 00:15:06.325 "name": null, 00:15:06.325 "uuid": "97a2795c-b9ba-5368-b0e5-34df474b20bc", 00:15:06.325 "is_configured": false, 00:15:06.325 "data_offset": 2048, 00:15:06.325 "data_size": 63488 00:15:06.325 }, 00:15:06.325 { 00:15:06.325 "name": null, 00:15:06.325 "uuid": "0c90c30f-4f2a-5aa7-a23d-70aef1cdb726", 00:15:06.325 "is_configured": false, 00:15:06.325 "data_offset": 2048, 00:15:06.325 "data_size": 63488 00:15:06.325 } 00:15:06.325 ] 00:15:06.325 }' 00:15:06.325 04:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 
-- # xtrace_disable 00:15:06.325 04:15:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.890 04:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:15:06.890 04:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:15:06.890 04:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:07.148 [2024-05-15 04:15:54.921140] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:07.148 [2024-05-15 04:15:54.921216] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:07.148 [2024-05-15 04:15:54.921241] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa47e0 00:15:07.148 [2024-05-15 04:15:54.921253] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:07.148 [2024-05-15 04:15:54.921600] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:07.148 [2024-05-15 04:15:54.921620] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:07.148 [2024-05-15 04:15:54.921695] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:07.148 [2024-05-15 04:15:54.921720] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:07.148 pt2 00:15:07.148 04:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:15:07.148 04:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:15:07.148 04:15:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:07.405 [2024-05-15 04:15:55.213892] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:07.405 [2024-05-15 04:15:55.213931] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:07.405 [2024-05-15 04:15:55.213952] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa3370 00:15:07.405 [2024-05-15 04:15:55.213966] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:07.405 [2024-05-15 04:15:55.214258] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:07.405 [2024-05-15 04:15:55.214284] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:07.405 [2024-05-15 04:15:55.214343] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:07.405 [2024-05-15 04:15:55.214369] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:07.405 pt3 00:15:07.405 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:15:07.405 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:15:07.405 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:15:07.662 [2024-05-15 04:15:55.506678] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on 
malloc4 00:15:07.662 [2024-05-15 04:15:55.506736] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:07.662 [2024-05-15 04:15:55.506764] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fa75e0 00:15:07.662 [2024-05-15 04:15:55.506780] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:07.662 [2024-05-15 04:15:55.507195] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:07.662 [2024-05-15 04:15:55.507223] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:15:07.662 [2024-05-15 04:15:55.507307] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:15:07.662 [2024-05-15 04:15:55.507337] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:15:07.662 [2024-05-15 04:15:55.507491] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fa4f20 00:15:07.662 [2024-05-15 04:15:55.507507] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:07.662 [2024-05-15 04:15:55.507684] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20d72b0 00:15:07.662 [2024-05-15 04:15:55.507847] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fa4f20 00:15:07.662 [2024-05-15 04:15:55.507863] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fa4f20 00:15:07.662 [2024-05-15 04:15:55.507973] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:07.662 pt4 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid0 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.662 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:07.920 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:07.920 "name": "raid_bdev1", 00:15:07.920 "uuid": "7afa61fe-137b-4884-803b-8d11da2e1a81", 00:15:07.920 "strip_size_kb": 64, 00:15:07.920 "state": "online", 00:15:07.920 "raid_level": 
"raid0", 00:15:07.920 "superblock": true, 00:15:07.920 "num_base_bdevs": 4, 00:15:07.920 "num_base_bdevs_discovered": 4, 00:15:07.920 "num_base_bdevs_operational": 4, 00:15:07.920 "base_bdevs_list": [ 00:15:07.920 { 00:15:07.920 "name": "pt1", 00:15:07.920 "uuid": "84f5011e-77ab-50d3-9457-7415d2d8b070", 00:15:07.920 "is_configured": true, 00:15:07.920 "data_offset": 2048, 00:15:07.920 "data_size": 63488 00:15:07.920 }, 00:15:07.920 { 00:15:07.920 "name": "pt2", 00:15:07.920 "uuid": "0b3133d8-fe8e-5d3f-857f-7dd468d359b4", 00:15:07.920 "is_configured": true, 00:15:07.920 "data_offset": 2048, 00:15:07.920 "data_size": 63488 00:15:07.920 }, 00:15:07.920 { 00:15:07.920 "name": "pt3", 00:15:07.920 "uuid": "97a2795c-b9ba-5368-b0e5-34df474b20bc", 00:15:07.920 "is_configured": true, 00:15:07.920 "data_offset": 2048, 00:15:07.920 "data_size": 63488 00:15:07.920 }, 00:15:07.920 { 00:15:07.920 "name": "pt4", 00:15:07.920 "uuid": "0c90c30f-4f2a-5aa7-a23d-70aef1cdb726", 00:15:07.920 "is_configured": true, 00:15:07.920 "data_offset": 2048, 00:15:07.920 "data_size": 63488 00:15:07.920 } 00:15:07.920 ] 00:15:07.920 }' 00:15:07.920 04:15:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:07.920 04:15:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.485 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:15:08.485 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:15:08.485 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:08.485 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:08.485 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:08.485 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:15:08.485 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:08.485 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:08.743 [2024-05-15 04:15:56.545685] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:08.743 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:08.743 "name": "raid_bdev1", 00:15:08.743 "aliases": [ 00:15:08.743 "7afa61fe-137b-4884-803b-8d11da2e1a81" 00:15:08.743 ], 00:15:08.743 "product_name": "Raid Volume", 00:15:08.743 "block_size": 512, 00:15:08.743 "num_blocks": 253952, 00:15:08.743 "uuid": "7afa61fe-137b-4884-803b-8d11da2e1a81", 00:15:08.743 "assigned_rate_limits": { 00:15:08.743 "rw_ios_per_sec": 0, 00:15:08.743 "rw_mbytes_per_sec": 0, 00:15:08.743 "r_mbytes_per_sec": 0, 00:15:08.743 "w_mbytes_per_sec": 0 00:15:08.743 }, 00:15:08.743 "claimed": false, 00:15:08.743 "zoned": false, 00:15:08.743 "supported_io_types": { 00:15:08.743 "read": true, 00:15:08.743 "write": true, 00:15:08.743 "unmap": true, 00:15:08.743 "write_zeroes": true, 00:15:08.743 "flush": true, 00:15:08.743 "reset": true, 00:15:08.743 "compare": false, 00:15:08.743 "compare_and_write": false, 00:15:08.743 "abort": false, 00:15:08.743 "nvme_admin": false, 00:15:08.743 "nvme_io": false 00:15:08.743 }, 00:15:08.743 "memory_domains": [ 00:15:08.743 { 00:15:08.743 "dma_device_id": "system", 00:15:08.743 "dma_device_type": 1 
00:15:08.743 }, 00:15:08.743 { 00:15:08.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.743 "dma_device_type": 2 00:15:08.743 }, 00:15:08.743 { 00:15:08.743 "dma_device_id": "system", 00:15:08.743 "dma_device_type": 1 00:15:08.743 }, 00:15:08.743 { 00:15:08.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.743 "dma_device_type": 2 00:15:08.743 }, 00:15:08.743 { 00:15:08.743 "dma_device_id": "system", 00:15:08.743 "dma_device_type": 1 00:15:08.743 }, 00:15:08.743 { 00:15:08.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.743 "dma_device_type": 2 00:15:08.743 }, 00:15:08.743 { 00:15:08.743 "dma_device_id": "system", 00:15:08.743 "dma_device_type": 1 00:15:08.743 }, 00:15:08.743 { 00:15:08.743 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:08.743 "dma_device_type": 2 00:15:08.743 } 00:15:08.743 ], 00:15:08.743 "driver_specific": { 00:15:08.743 "raid": { 00:15:08.743 "uuid": "7afa61fe-137b-4884-803b-8d11da2e1a81", 00:15:08.743 "strip_size_kb": 64, 00:15:08.743 "state": "online", 00:15:08.743 "raid_level": "raid0", 00:15:08.743 "superblock": true, 00:15:08.743 "num_base_bdevs": 4, 00:15:08.743 "num_base_bdevs_discovered": 4, 00:15:08.743 "num_base_bdevs_operational": 4, 00:15:08.743 "base_bdevs_list": [ 00:15:08.743 { 00:15:08.743 "name": "pt1", 00:15:08.743 "uuid": "84f5011e-77ab-50d3-9457-7415d2d8b070", 00:15:08.743 "is_configured": true, 00:15:08.743 "data_offset": 2048, 00:15:08.743 "data_size": 63488 00:15:08.743 }, 00:15:08.743 { 00:15:08.743 "name": "pt2", 00:15:08.743 "uuid": "0b3133d8-fe8e-5d3f-857f-7dd468d359b4", 00:15:08.743 "is_configured": true, 00:15:08.743 "data_offset": 2048, 00:15:08.743 "data_size": 63488 00:15:08.743 }, 00:15:08.743 { 00:15:08.743 "name": "pt3", 00:15:08.743 "uuid": "97a2795c-b9ba-5368-b0e5-34df474b20bc", 00:15:08.743 "is_configured": true, 00:15:08.743 "data_offset": 2048, 00:15:08.743 "data_size": 63488 00:15:08.743 }, 00:15:08.743 { 00:15:08.743 "name": "pt4", 00:15:08.743 "uuid": "0c90c30f-4f2a-5aa7-a23d-70aef1cdb726", 00:15:08.743 "is_configured": true, 00:15:08.743 "data_offset": 2048, 00:15:08.743 "data_size": 63488 00:15:08.743 } 00:15:08.743 ] 00:15:08.743 } 00:15:08.743 } 00:15:08.743 }' 00:15:08.743 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:08.743 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:15:08.743 pt2 00:15:08.743 pt3 00:15:08.743 pt4' 00:15:08.743 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:08.743 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:08.743 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:09.000 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:09.000 "name": "pt1", 00:15:09.000 "aliases": [ 00:15:09.000 "84f5011e-77ab-50d3-9457-7415d2d8b070" 00:15:09.000 ], 00:15:09.000 "product_name": "passthru", 00:15:09.000 "block_size": 512, 00:15:09.000 "num_blocks": 65536, 00:15:09.000 "uuid": "84f5011e-77ab-50d3-9457-7415d2d8b070", 00:15:09.000 "assigned_rate_limits": { 00:15:09.000 "rw_ios_per_sec": 0, 00:15:09.000 "rw_mbytes_per_sec": 0, 00:15:09.000 "r_mbytes_per_sec": 0, 00:15:09.000 "w_mbytes_per_sec": 0 00:15:09.000 }, 00:15:09.000 "claimed": true, 00:15:09.000 "claim_type": 
"exclusive_write", 00:15:09.000 "zoned": false, 00:15:09.000 "supported_io_types": { 00:15:09.000 "read": true, 00:15:09.000 "write": true, 00:15:09.000 "unmap": true, 00:15:09.000 "write_zeroes": true, 00:15:09.000 "flush": true, 00:15:09.000 "reset": true, 00:15:09.000 "compare": false, 00:15:09.000 "compare_and_write": false, 00:15:09.000 "abort": true, 00:15:09.000 "nvme_admin": false, 00:15:09.000 "nvme_io": false 00:15:09.000 }, 00:15:09.000 "memory_domains": [ 00:15:09.000 { 00:15:09.000 "dma_device_id": "system", 00:15:09.000 "dma_device_type": 1 00:15:09.000 }, 00:15:09.000 { 00:15:09.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.000 "dma_device_type": 2 00:15:09.000 } 00:15:09.000 ], 00:15:09.000 "driver_specific": { 00:15:09.000 "passthru": { 00:15:09.000 "name": "pt1", 00:15:09.000 "base_bdev_name": "malloc1" 00:15:09.000 } 00:15:09.000 } 00:15:09.000 }' 00:15:09.000 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:09.000 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:09.000 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:09.000 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:09.000 04:15:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:09.000 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:09.000 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:09.258 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:09.258 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:09.258 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:09.258 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:09.258 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:09.258 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:09.258 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:09.258 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:09.516 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:09.516 "name": "pt2", 00:15:09.516 "aliases": [ 00:15:09.516 "0b3133d8-fe8e-5d3f-857f-7dd468d359b4" 00:15:09.516 ], 00:15:09.516 "product_name": "passthru", 00:15:09.516 "block_size": 512, 00:15:09.516 "num_blocks": 65536, 00:15:09.516 "uuid": "0b3133d8-fe8e-5d3f-857f-7dd468d359b4", 00:15:09.516 "assigned_rate_limits": { 00:15:09.516 "rw_ios_per_sec": 0, 00:15:09.516 "rw_mbytes_per_sec": 0, 00:15:09.516 "r_mbytes_per_sec": 0, 00:15:09.516 "w_mbytes_per_sec": 0 00:15:09.516 }, 00:15:09.516 "claimed": true, 00:15:09.516 "claim_type": "exclusive_write", 00:15:09.516 "zoned": false, 00:15:09.516 "supported_io_types": { 00:15:09.516 "read": true, 00:15:09.516 "write": true, 00:15:09.516 "unmap": true, 00:15:09.516 "write_zeroes": true, 00:15:09.516 "flush": true, 00:15:09.516 "reset": true, 00:15:09.516 "compare": false, 00:15:09.516 "compare_and_write": false, 00:15:09.516 "abort": true, 00:15:09.516 "nvme_admin": false, 00:15:09.516 "nvme_io": false 00:15:09.516 
}, 00:15:09.516 "memory_domains": [ 00:15:09.516 { 00:15:09.516 "dma_device_id": "system", 00:15:09.516 "dma_device_type": 1 00:15:09.516 }, 00:15:09.516 { 00:15:09.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.516 "dma_device_type": 2 00:15:09.516 } 00:15:09.516 ], 00:15:09.516 "driver_specific": { 00:15:09.516 "passthru": { 00:15:09.516 "name": "pt2", 00:15:09.516 "base_bdev_name": "malloc2" 00:15:09.516 } 00:15:09.516 } 00:15:09.516 }' 00:15:09.516 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:09.516 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:09.516 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:09.516 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:09.516 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:09.774 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:09.774 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:09.774 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:09.774 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:09.774 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:09.774 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:09.774 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:09.774 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:09.774 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:09.774 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:10.031 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:10.031 "name": "pt3", 00:15:10.031 "aliases": [ 00:15:10.031 "97a2795c-b9ba-5368-b0e5-34df474b20bc" 00:15:10.031 ], 00:15:10.031 "product_name": "passthru", 00:15:10.031 "block_size": 512, 00:15:10.031 "num_blocks": 65536, 00:15:10.031 "uuid": "97a2795c-b9ba-5368-b0e5-34df474b20bc", 00:15:10.031 "assigned_rate_limits": { 00:15:10.031 "rw_ios_per_sec": 0, 00:15:10.031 "rw_mbytes_per_sec": 0, 00:15:10.031 "r_mbytes_per_sec": 0, 00:15:10.031 "w_mbytes_per_sec": 0 00:15:10.031 }, 00:15:10.031 "claimed": true, 00:15:10.031 "claim_type": "exclusive_write", 00:15:10.031 "zoned": false, 00:15:10.031 "supported_io_types": { 00:15:10.031 "read": true, 00:15:10.031 "write": true, 00:15:10.031 "unmap": true, 00:15:10.031 "write_zeroes": true, 00:15:10.031 "flush": true, 00:15:10.031 "reset": true, 00:15:10.031 "compare": false, 00:15:10.031 "compare_and_write": false, 00:15:10.031 "abort": true, 00:15:10.031 "nvme_admin": false, 00:15:10.031 "nvme_io": false 00:15:10.031 }, 00:15:10.031 "memory_domains": [ 00:15:10.031 { 00:15:10.031 "dma_device_id": "system", 00:15:10.031 "dma_device_type": 1 00:15:10.031 }, 00:15:10.031 { 00:15:10.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.031 "dma_device_type": 2 00:15:10.031 } 00:15:10.031 ], 00:15:10.031 "driver_specific": { 00:15:10.031 "passthru": { 00:15:10.031 "name": "pt3", 00:15:10.031 "base_bdev_name": "malloc3" 00:15:10.031 } 00:15:10.031 } 
00:15:10.031 }' 00:15:10.031 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:10.031 04:15:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:10.031 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:10.031 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:15:10.289 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:10.546 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:10.546 "name": "pt4", 00:15:10.546 "aliases": [ 00:15:10.546 "0c90c30f-4f2a-5aa7-a23d-70aef1cdb726" 00:15:10.546 ], 00:15:10.546 "product_name": "passthru", 00:15:10.546 "block_size": 512, 00:15:10.546 "num_blocks": 65536, 00:15:10.546 "uuid": "0c90c30f-4f2a-5aa7-a23d-70aef1cdb726", 00:15:10.546 "assigned_rate_limits": { 00:15:10.546 "rw_ios_per_sec": 0, 00:15:10.546 "rw_mbytes_per_sec": 0, 00:15:10.546 "r_mbytes_per_sec": 0, 00:15:10.546 "w_mbytes_per_sec": 0 00:15:10.546 }, 00:15:10.546 "claimed": true, 00:15:10.546 "claim_type": "exclusive_write", 00:15:10.546 "zoned": false, 00:15:10.547 "supported_io_types": { 00:15:10.547 "read": true, 00:15:10.547 "write": true, 00:15:10.547 "unmap": true, 00:15:10.547 "write_zeroes": true, 00:15:10.547 "flush": true, 00:15:10.547 "reset": true, 00:15:10.547 "compare": false, 00:15:10.547 "compare_and_write": false, 00:15:10.547 "abort": true, 00:15:10.547 "nvme_admin": false, 00:15:10.547 "nvme_io": false 00:15:10.547 }, 00:15:10.547 "memory_domains": [ 00:15:10.547 { 00:15:10.547 "dma_device_id": "system", 00:15:10.547 "dma_device_type": 1 00:15:10.547 }, 00:15:10.547 { 00:15:10.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:10.547 "dma_device_type": 2 00:15:10.547 } 00:15:10.547 ], 00:15:10.547 "driver_specific": { 00:15:10.547 "passthru": { 00:15:10.547 "name": "pt4", 00:15:10.547 "base_bdev_name": "malloc4" 00:15:10.547 } 00:15:10.547 } 00:15:10.547 }' 00:15:10.547 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:10.547 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:10.804 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:10.804 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:10.804 04:15:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:10.804 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:10.804 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:10.804 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:10.804 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:10.804 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:10.804 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:10.804 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:10.804 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:10.805 04:15:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:15:11.062 [2024-05-15 04:15:59.016265] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:11.062 04:15:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 7afa61fe-137b-4884-803b-8d11da2e1a81 '!=' 7afa61fe-137b-4884-803b-8d11da2e1a81 ']' 00:15:11.062 04:15:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid0 00:15:11.062 04:15:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:15:11.062 04:15:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # killprocess 3878924 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 3878924 ']' 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 3878924 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3878924 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3878924' 00:15:11.063 killing process with pid 3878924 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 3878924 00:15:11.063 [2024-05-15 04:15:59.064115] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:11.063 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 3878924 00:15:11.063 [2024-05-15 04:15:59.064218] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:11.063 [2024-05-15 04:15:59.064296] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:11.063 [2024-05-15 04:15:59.064312] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fa4f20 name raid_bdev1, state offline 00:15:11.320 [2024-05-15 04:15:59.108087] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 
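Editor's note for readers skimming the trace above: the repeated verify_raid_bdev_state and verify_raid_bdev_properties blocks all reduce to querying the test target over its RPC socket and comparing a handful of JSON fields with jq. A minimal standalone sketch of that check follows; the helper name check_raid_state is illustrative and not part of the SPDK test suite, but the rpc.py and jq invocations are the same ones traced above.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    check_raid_state() {
        local name=$1 expected_state=$2 expected_discovered=$3
        local info
        # Same query the test traces: dump all raid bdevs, keep the one under test.
        info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all \
               | jq -r ".[] | select(.name == \"$name\")")
        [[ $(jq -r '.state' <<< "$info") == "$expected_state" ]] || return 1
        [[ $(jq -r '.num_base_bdevs_discovered' <<< "$info") == "$expected_discovered" ]]
    }

    # The two states exercised above: configuring with one discovered base bdev,
    # then online once pt1..pt4 are all claimed.
    check_raid_state raid_bdev1 configuring 1
    check_raid_state raid_bdev1 online 4

    # Per-base-bdev property check, as in verify_raid_bdev_properties:
    "$rpc" -s "$sock" bdev_get_bdevs -b pt1 | jq '.[].block_size'   # expect 512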
00:15:11.578 04:15:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@565 -- # return 0 00:15:11.578 00:15:11.578 real 0m16.062s 00:15:11.578 user 0m29.553s 00:15:11.578 sys 0m2.137s 00:15:11.578 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:15:11.578 04:15:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.578 ************************************ 00:15:11.578 END TEST raid_superblock_test 00:15:11.578 ************************************ 00:15:11.578 04:15:59 bdev_raid -- bdev/bdev_raid.sh@802 -- # for level in raid0 concat raid1 00:15:11.578 04:15:59 bdev_raid -- bdev/bdev_raid.sh@803 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:15:11.578 04:15:59 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:15:11.578 04:15:59 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:15:11.578 04:15:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:11.578 ************************************ 00:15:11.578 START TEST raid_state_function_test 00:15:11.578 ************************************ 00:15:11.578 04:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 4 false 00:15:11.578 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local 
raid_bdev_name=Existed_Raid 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=3881231 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3881231' 00:15:11.579 Process raid pid: 3881231 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 3881231 /var/tmp/spdk-raid.sock 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 3881231 ']' 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:11.579 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:15:11.579 04:15:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.579 [2024-05-15 04:15:59.477611] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
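Editor's note: the raid_state_function_test setup traced here amounts to starting the bdev_svc test application with its own RPC socket and the bdev_raid log flag, waiting for that socket, and then driving everything through rpc.py. A rough approximation is sketched below; waitforlisten's real implementation lives in SPDK's common test scripts, so the polling loop here is only an assumption.

    svc=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc
    sock=/var/tmp/spdk-raid.sock

    # Start the target with a dedicated RPC socket and bdev_raid debug logging,
    # as in the trace above.
    "$svc" -r "$sock" -i 0 -L bdev_raid &
    raid_pid=$!

    # Stand-in for waitforlisten: poll until the UNIX-domain socket appears.
    for _ in $(seq 1 100); do
        [[ -S $sock ]] && break
        sleep 0.1
    done

    # ... issue bdev_malloc_create / bdev_raid_create RPCs against $sock ...

    kill "$raid_pid"   # the real test tears down with killprocess and the recorded pid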
00:15:11.579 [2024-05-15 04:15:59.477681] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:11.579 [2024-05-15 04:15:59.554359] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:11.837 [2024-05-15 04:15:59.666009] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:15:11.837 [2024-05-15 04:15:59.737546] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:11.837 [2024-05-15 04:15:59.737600] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:12.770 [2024-05-15 04:16:00.701259] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:12.770 [2024-05-15 04:16:00.701298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:12.770 [2024-05-15 04:16:00.701308] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:12.770 [2024-05-15 04:16:00.701319] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:12.770 [2024-05-15 04:16:00.701326] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:12.770 [2024-05-15 04:16:00.701336] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:12.770 [2024-05-15 04:16:00.701343] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:12.770 [2024-05-15 04:16:00.701352] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.770 04:16:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.028 04:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:13.028 "name": "Existed_Raid", 00:15:13.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.028 "strip_size_kb": 64, 00:15:13.028 "state": "configuring", 00:15:13.028 "raid_level": "concat", 00:15:13.028 "superblock": false, 00:15:13.028 "num_base_bdevs": 4, 00:15:13.028 "num_base_bdevs_discovered": 0, 00:15:13.028 "num_base_bdevs_operational": 4, 00:15:13.028 "base_bdevs_list": [ 00:15:13.028 { 00:15:13.028 "name": "BaseBdev1", 00:15:13.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.028 "is_configured": false, 00:15:13.028 "data_offset": 0, 00:15:13.028 "data_size": 0 00:15:13.028 }, 00:15:13.028 { 00:15:13.028 "name": "BaseBdev2", 00:15:13.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.028 "is_configured": false, 00:15:13.028 "data_offset": 0, 00:15:13.028 "data_size": 0 00:15:13.028 }, 00:15:13.028 { 00:15:13.028 "name": "BaseBdev3", 00:15:13.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.028 "is_configured": false, 00:15:13.028 "data_offset": 0, 00:15:13.028 "data_size": 0 00:15:13.028 }, 00:15:13.028 { 00:15:13.028 "name": "BaseBdev4", 00:15:13.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.028 "is_configured": false, 00:15:13.028 "data_offset": 0, 00:15:13.028 "data_size": 0 00:15:13.028 } 00:15:13.028 ] 00:15:13.028 }' 00:15:13.028 04:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:13.028 04:16:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:13.593 04:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:13.849 [2024-05-15 04:16:01.783991] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:13.850 [2024-05-15 04:16:01.784024] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23e7040 name Existed_Raid, state configuring 00:15:13.850 04:16:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:14.139 [2024-05-15 04:16:02.036673] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:14.139 [2024-05-15 04:16:02.036714] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:14.139 [2024-05-15 04:16:02.036736] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:14.139 [2024-05-15 04:16:02.036746] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:14.139 [2024-05-15 04:16:02.036755] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:14.139 [2024-05-15 04:16:02.036767] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:14.139 [2024-05-15 04:16:02.036775] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:14.139 [2024-05-15 04:16:02.036786] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: 
*DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:14.139 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:14.422 [2024-05-15 04:16:02.304566] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:14.422 BaseBdev1 00:15:14.422 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:15:14.422 04:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:15:14.422 04:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:14.422 04:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:14.422 04:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:14.423 04:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:14.423 04:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:14.680 04:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:14.938 [ 00:15:14.938 { 00:15:14.938 "name": "BaseBdev1", 00:15:14.938 "aliases": [ 00:15:14.938 "6b68f5c9-ec66-4816-9e08-d0b736e1a8ef" 00:15:14.938 ], 00:15:14.938 "product_name": "Malloc disk", 00:15:14.938 "block_size": 512, 00:15:14.938 "num_blocks": 65536, 00:15:14.938 "uuid": "6b68f5c9-ec66-4816-9e08-d0b736e1a8ef", 00:15:14.938 "assigned_rate_limits": { 00:15:14.938 "rw_ios_per_sec": 0, 00:15:14.938 "rw_mbytes_per_sec": 0, 00:15:14.938 "r_mbytes_per_sec": 0, 00:15:14.938 "w_mbytes_per_sec": 0 00:15:14.938 }, 00:15:14.938 "claimed": true, 00:15:14.938 "claim_type": "exclusive_write", 00:15:14.938 "zoned": false, 00:15:14.938 "supported_io_types": { 00:15:14.938 "read": true, 00:15:14.938 "write": true, 00:15:14.938 "unmap": true, 00:15:14.938 "write_zeroes": true, 00:15:14.938 "flush": true, 00:15:14.938 "reset": true, 00:15:14.938 "compare": false, 00:15:14.938 "compare_and_write": false, 00:15:14.938 "abort": true, 00:15:14.938 "nvme_admin": false, 00:15:14.938 "nvme_io": false 00:15:14.938 }, 00:15:14.938 "memory_domains": [ 00:15:14.938 { 00:15:14.938 "dma_device_id": "system", 00:15:14.938 "dma_device_type": 1 00:15:14.938 }, 00:15:14.938 { 00:15:14.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.938 "dma_device_type": 2 00:15:14.938 } 00:15:14.938 ], 00:15:14.938 "driver_specific": {} 00:15:14.938 } 00:15:14.938 ] 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local strip_size=64 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.938 04:16:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:15.196 04:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:15.196 "name": "Existed_Raid", 00:15:15.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.196 "strip_size_kb": 64, 00:15:15.196 "state": "configuring", 00:15:15.196 "raid_level": "concat", 00:15:15.196 "superblock": false, 00:15:15.196 "num_base_bdevs": 4, 00:15:15.196 "num_base_bdevs_discovered": 1, 00:15:15.196 "num_base_bdevs_operational": 4, 00:15:15.196 "base_bdevs_list": [ 00:15:15.196 { 00:15:15.196 "name": "BaseBdev1", 00:15:15.196 "uuid": "6b68f5c9-ec66-4816-9e08-d0b736e1a8ef", 00:15:15.196 "is_configured": true, 00:15:15.196 "data_offset": 0, 00:15:15.196 "data_size": 65536 00:15:15.196 }, 00:15:15.196 { 00:15:15.196 "name": "BaseBdev2", 00:15:15.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.196 "is_configured": false, 00:15:15.196 "data_offset": 0, 00:15:15.196 "data_size": 0 00:15:15.196 }, 00:15:15.196 { 00:15:15.196 "name": "BaseBdev3", 00:15:15.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.196 "is_configured": false, 00:15:15.196 "data_offset": 0, 00:15:15.196 "data_size": 0 00:15:15.196 }, 00:15:15.196 { 00:15:15.196 "name": "BaseBdev4", 00:15:15.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:15.196 "is_configured": false, 00:15:15.196 "data_offset": 0, 00:15:15.196 "data_size": 0 00:15:15.196 } 00:15:15.196 ] 00:15:15.196 }' 00:15:15.196 04:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:15.196 04:16:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:15.763 04:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:16.020 [2024-05-15 04:16:03.856646] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:16.020 [2024-05-15 04:16:03.856696] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23e68b0 name Existed_Raid, state configuring 00:15:16.020 04:16:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:16.278 [2024-05-15 04:16:04.093302] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:16.278 [2024-05-15 04:16:04.094795] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:16.278 
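Editor's note: the remainder of the trace repeats one pattern per base bdev: create the next malloc bdev with the name the raid is waiting for, let bdev_raid claim it, and confirm that num_base_bdevs_discovered has grown by one, with the raid reaching online only after BaseBdev4. A condensed, illustrative loop is sketched below; the real script also interleaves a delete/re-create of Existed_Raid and checks many more JSON fields, which are omitted here.

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Declare the raid first; its base bdevs do not exist yet, so it sits in
    # "configuring" with zero discovered members.
    "$rpc" -s "$sock" bdev_raid_create -z 64 -r concat \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

    # Add the malloc base bdevs one by one and watch the raid pick each one up.
    for i in 1 2 3 4; do
        "$rpc" -s "$sock" bdev_malloc_create 32 512 -b "BaseBdev$i"
        "$rpc" -s "$sock" bdev_raid_get_bdevs all \
            | jq -r '.[] | select(.name == "Existed_Raid")
                     | "\(.state) discovered=\(.num_base_bdevs_discovered)"'
    done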
[2024-05-15 04:16:04.094839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:16.278 [2024-05-15 04:16:04.094863] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:16.278 [2024-05-15 04:16:04.094876] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:16.278 [2024-05-15 04:16:04.094886] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:16.278 [2024-05-15 04:16:04.094898] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.278 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.536 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:16.536 "name": "Existed_Raid", 00:15:16.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.536 "strip_size_kb": 64, 00:15:16.536 "state": "configuring", 00:15:16.536 "raid_level": "concat", 00:15:16.536 "superblock": false, 00:15:16.536 "num_base_bdevs": 4, 00:15:16.536 "num_base_bdevs_discovered": 1, 00:15:16.536 "num_base_bdevs_operational": 4, 00:15:16.536 "base_bdevs_list": [ 00:15:16.536 { 00:15:16.536 "name": "BaseBdev1", 00:15:16.536 "uuid": "6b68f5c9-ec66-4816-9e08-d0b736e1a8ef", 00:15:16.536 "is_configured": true, 00:15:16.536 "data_offset": 0, 00:15:16.536 "data_size": 65536 00:15:16.536 }, 00:15:16.536 { 00:15:16.536 "name": "BaseBdev2", 00:15:16.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.536 "is_configured": false, 00:15:16.536 "data_offset": 0, 00:15:16.536 "data_size": 0 00:15:16.536 }, 00:15:16.536 { 00:15:16.536 "name": "BaseBdev3", 00:15:16.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.536 "is_configured": false, 00:15:16.536 "data_offset": 0, 00:15:16.536 "data_size": 0 00:15:16.536 }, 00:15:16.536 { 00:15:16.536 
"name": "BaseBdev4", 00:15:16.536 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.536 "is_configured": false, 00:15:16.536 "data_offset": 0, 00:15:16.536 "data_size": 0 00:15:16.536 } 00:15:16.536 ] 00:15:16.536 }' 00:15:16.536 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:16.536 04:16:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:17.102 04:16:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:17.360 [2024-05-15 04:16:05.121388] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:17.360 BaseBdev2 00:15:17.360 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:15:17.360 04:16:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:15:17.360 04:16:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:17.360 04:16:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:17.360 04:16:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:17.360 04:16:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:17.360 04:16:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:17.619 04:16:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:17.619 [ 00:15:17.619 { 00:15:17.619 "name": "BaseBdev2", 00:15:17.619 "aliases": [ 00:15:17.619 "30ad46ee-cded-49bf-bbb2-8309dee82223" 00:15:17.619 ], 00:15:17.619 "product_name": "Malloc disk", 00:15:17.619 "block_size": 512, 00:15:17.619 "num_blocks": 65536, 00:15:17.619 "uuid": "30ad46ee-cded-49bf-bbb2-8309dee82223", 00:15:17.619 "assigned_rate_limits": { 00:15:17.619 "rw_ios_per_sec": 0, 00:15:17.619 "rw_mbytes_per_sec": 0, 00:15:17.619 "r_mbytes_per_sec": 0, 00:15:17.619 "w_mbytes_per_sec": 0 00:15:17.619 }, 00:15:17.619 "claimed": true, 00:15:17.619 "claim_type": "exclusive_write", 00:15:17.619 "zoned": false, 00:15:17.619 "supported_io_types": { 00:15:17.619 "read": true, 00:15:17.619 "write": true, 00:15:17.619 "unmap": true, 00:15:17.619 "write_zeroes": true, 00:15:17.619 "flush": true, 00:15:17.619 "reset": true, 00:15:17.619 "compare": false, 00:15:17.619 "compare_and_write": false, 00:15:17.619 "abort": true, 00:15:17.619 "nvme_admin": false, 00:15:17.619 "nvme_io": false 00:15:17.619 }, 00:15:17.619 "memory_domains": [ 00:15:17.619 { 00:15:17.619 "dma_device_id": "system", 00:15:17.619 "dma_device_type": 1 00:15:17.619 }, 00:15:17.619 { 00:15:17.619 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.619 "dma_device_type": 2 00:15:17.619 } 00:15:17.619 ], 00:15:17.619 "driver_specific": {} 00:15:17.619 } 00:15:17.619 ] 00:15:17.619 04:16:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:17.619 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:17.619 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < 
num_base_bdevs )) 00:15:17.619 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:17.619 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:17.619 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:17.619 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:17.619 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:17.620 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:17.620 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:17.620 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:17.620 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:17.620 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:17.620 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.620 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.878 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:17.878 "name": "Existed_Raid", 00:15:17.878 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.878 "strip_size_kb": 64, 00:15:17.878 "state": "configuring", 00:15:17.878 "raid_level": "concat", 00:15:17.878 "superblock": false, 00:15:17.878 "num_base_bdevs": 4, 00:15:17.878 "num_base_bdevs_discovered": 2, 00:15:17.878 "num_base_bdevs_operational": 4, 00:15:17.878 "base_bdevs_list": [ 00:15:17.878 { 00:15:17.878 "name": "BaseBdev1", 00:15:17.878 "uuid": "6b68f5c9-ec66-4816-9e08-d0b736e1a8ef", 00:15:17.878 "is_configured": true, 00:15:17.878 "data_offset": 0, 00:15:17.878 "data_size": 65536 00:15:17.878 }, 00:15:17.878 { 00:15:17.878 "name": "BaseBdev2", 00:15:17.878 "uuid": "30ad46ee-cded-49bf-bbb2-8309dee82223", 00:15:17.878 "is_configured": true, 00:15:17.878 "data_offset": 0, 00:15:17.878 "data_size": 65536 00:15:17.878 }, 00:15:17.878 { 00:15:17.878 "name": "BaseBdev3", 00:15:17.878 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.878 "is_configured": false, 00:15:17.878 "data_offset": 0, 00:15:17.879 "data_size": 0 00:15:17.879 }, 00:15:17.879 { 00:15:17.879 "name": "BaseBdev4", 00:15:17.879 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.879 "is_configured": false, 00:15:17.879 "data_offset": 0, 00:15:17.879 "data_size": 0 00:15:17.879 } 00:15:17.879 ] 00:15:17.879 }' 00:15:17.879 04:16:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:17.879 04:16:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.444 04:16:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:18.702 [2024-05-15 04:16:06.666765] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:18.702 BaseBdev3 00:15:18.702 04:16:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:15:18.702 04:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:15:18.702 04:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:18.702 04:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:18.702 04:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:18.702 04:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:18.702 04:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:18.960 04:16:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:19.218 [ 00:15:19.218 { 00:15:19.218 "name": "BaseBdev3", 00:15:19.218 "aliases": [ 00:15:19.218 "75e6d1be-292d-4711-9911-6b58fa46591e" 00:15:19.218 ], 00:15:19.218 "product_name": "Malloc disk", 00:15:19.218 "block_size": 512, 00:15:19.218 "num_blocks": 65536, 00:15:19.218 "uuid": "75e6d1be-292d-4711-9911-6b58fa46591e", 00:15:19.218 "assigned_rate_limits": { 00:15:19.218 "rw_ios_per_sec": 0, 00:15:19.218 "rw_mbytes_per_sec": 0, 00:15:19.218 "r_mbytes_per_sec": 0, 00:15:19.218 "w_mbytes_per_sec": 0 00:15:19.218 }, 00:15:19.218 "claimed": true, 00:15:19.218 "claim_type": "exclusive_write", 00:15:19.218 "zoned": false, 00:15:19.218 "supported_io_types": { 00:15:19.218 "read": true, 00:15:19.218 "write": true, 00:15:19.218 "unmap": true, 00:15:19.218 "write_zeroes": true, 00:15:19.218 "flush": true, 00:15:19.218 "reset": true, 00:15:19.218 "compare": false, 00:15:19.218 "compare_and_write": false, 00:15:19.218 "abort": true, 00:15:19.218 "nvme_admin": false, 00:15:19.218 "nvme_io": false 00:15:19.218 }, 00:15:19.218 "memory_domains": [ 00:15:19.218 { 00:15:19.218 "dma_device_id": "system", 00:15:19.218 "dma_device_type": 1 00:15:19.218 }, 00:15:19.218 { 00:15:19.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.218 "dma_device_type": 2 00:15:19.218 } 00:15:19.218 ], 00:15:19.218 "driver_specific": {} 00:15:19.218 } 00:15:19.218 ] 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.218 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.476 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:19.476 "name": "Existed_Raid", 00:15:19.476 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.476 "strip_size_kb": 64, 00:15:19.476 "state": "configuring", 00:15:19.476 "raid_level": "concat", 00:15:19.476 "superblock": false, 00:15:19.476 "num_base_bdevs": 4, 00:15:19.476 "num_base_bdevs_discovered": 3, 00:15:19.476 "num_base_bdevs_operational": 4, 00:15:19.476 "base_bdevs_list": [ 00:15:19.476 { 00:15:19.476 "name": "BaseBdev1", 00:15:19.476 "uuid": "6b68f5c9-ec66-4816-9e08-d0b736e1a8ef", 00:15:19.476 "is_configured": true, 00:15:19.476 "data_offset": 0, 00:15:19.476 "data_size": 65536 00:15:19.476 }, 00:15:19.476 { 00:15:19.476 "name": "BaseBdev2", 00:15:19.476 "uuid": "30ad46ee-cded-49bf-bbb2-8309dee82223", 00:15:19.476 "is_configured": true, 00:15:19.476 "data_offset": 0, 00:15:19.476 "data_size": 65536 00:15:19.476 }, 00:15:19.476 { 00:15:19.476 "name": "BaseBdev3", 00:15:19.476 "uuid": "75e6d1be-292d-4711-9911-6b58fa46591e", 00:15:19.476 "is_configured": true, 00:15:19.476 "data_offset": 0, 00:15:19.477 "data_size": 65536 00:15:19.477 }, 00:15:19.477 { 00:15:19.477 "name": "BaseBdev4", 00:15:19.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.477 "is_configured": false, 00:15:19.477 "data_offset": 0, 00:15:19.477 "data_size": 0 00:15:19.477 } 00:15:19.477 ] 00:15:19.477 }' 00:15:19.477 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:19.477 04:16:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:20.043 04:16:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:20.301 [2024-05-15 04:16:08.216616] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:20.301 [2024-05-15 04:16:08.216666] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x23e77f0 00:15:20.301 [2024-05-15 04:16:08.216678] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:20.301 [2024-05-15 04:16:08.216894] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x259b5f0 00:15:20.301 [2024-05-15 04:16:08.217053] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23e77f0 00:15:20.301 [2024-05-15 04:16:08.217070] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x23e77f0 00:15:20.301 [2024-05-15 04:16:08.217285] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:20.301 BaseBdev4 00:15:20.301 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # 
waitforbdev BaseBdev4 00:15:20.301 04:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:15:20.301 04:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:20.301 04:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:20.301 04:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:20.301 04:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:20.301 04:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:20.559 04:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:20.817 [ 00:15:20.817 { 00:15:20.817 "name": "BaseBdev4", 00:15:20.817 "aliases": [ 00:15:20.817 "31b836b4-67db-4698-99c9-f87e0768f29e" 00:15:20.817 ], 00:15:20.817 "product_name": "Malloc disk", 00:15:20.817 "block_size": 512, 00:15:20.817 "num_blocks": 65536, 00:15:20.817 "uuid": "31b836b4-67db-4698-99c9-f87e0768f29e", 00:15:20.817 "assigned_rate_limits": { 00:15:20.817 "rw_ios_per_sec": 0, 00:15:20.817 "rw_mbytes_per_sec": 0, 00:15:20.817 "r_mbytes_per_sec": 0, 00:15:20.817 "w_mbytes_per_sec": 0 00:15:20.817 }, 00:15:20.817 "claimed": true, 00:15:20.817 "claim_type": "exclusive_write", 00:15:20.817 "zoned": false, 00:15:20.817 "supported_io_types": { 00:15:20.817 "read": true, 00:15:20.817 "write": true, 00:15:20.817 "unmap": true, 00:15:20.817 "write_zeroes": true, 00:15:20.817 "flush": true, 00:15:20.817 "reset": true, 00:15:20.817 "compare": false, 00:15:20.817 "compare_and_write": false, 00:15:20.817 "abort": true, 00:15:20.817 "nvme_admin": false, 00:15:20.817 "nvme_io": false 00:15:20.817 }, 00:15:20.817 "memory_domains": [ 00:15:20.817 { 00:15:20.817 "dma_device_id": "system", 00:15:20.817 "dma_device_type": 1 00:15:20.817 }, 00:15:20.817 { 00:15:20.817 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.817 "dma_device_type": 2 00:15:20.817 } 00:15:20.817 ], 00:15:20.817 "driver_specific": {} 00:15:20.817 } 00:15:20.817 ] 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:20.817 04:16:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.817 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.075 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:21.075 "name": "Existed_Raid", 00:15:21.075 "uuid": "1a85c1cc-23e3-461d-9836-3fcf5759ff0f", 00:15:21.075 "strip_size_kb": 64, 00:15:21.075 "state": "online", 00:15:21.075 "raid_level": "concat", 00:15:21.075 "superblock": false, 00:15:21.075 "num_base_bdevs": 4, 00:15:21.075 "num_base_bdevs_discovered": 4, 00:15:21.075 "num_base_bdevs_operational": 4, 00:15:21.075 "base_bdevs_list": [ 00:15:21.075 { 00:15:21.075 "name": "BaseBdev1", 00:15:21.075 "uuid": "6b68f5c9-ec66-4816-9e08-d0b736e1a8ef", 00:15:21.075 "is_configured": true, 00:15:21.075 "data_offset": 0, 00:15:21.075 "data_size": 65536 00:15:21.075 }, 00:15:21.075 { 00:15:21.075 "name": "BaseBdev2", 00:15:21.075 "uuid": "30ad46ee-cded-49bf-bbb2-8309dee82223", 00:15:21.075 "is_configured": true, 00:15:21.075 "data_offset": 0, 00:15:21.075 "data_size": 65536 00:15:21.075 }, 00:15:21.075 { 00:15:21.075 "name": "BaseBdev3", 00:15:21.075 "uuid": "75e6d1be-292d-4711-9911-6b58fa46591e", 00:15:21.075 "is_configured": true, 00:15:21.075 "data_offset": 0, 00:15:21.075 "data_size": 65536 00:15:21.075 }, 00:15:21.075 { 00:15:21.075 "name": "BaseBdev4", 00:15:21.075 "uuid": "31b836b4-67db-4698-99c9-f87e0768f29e", 00:15:21.075 "is_configured": true, 00:15:21.075 "data_offset": 0, 00:15:21.075 "data_size": 65536 00:15:21.075 } 00:15:21.075 ] 00:15:21.075 }' 00:15:21.075 04:16:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:21.075 04:16:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:21.640 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:15:21.640 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:15:21.640 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:21.640 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:21.640 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:21.640 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:15:21.640 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:21.640 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:21.898 [2024-05-15 04:16:09.741052] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:21.898 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:21.898 "name": "Existed_Raid", 00:15:21.898 "aliases": [ 00:15:21.898 
"1a85c1cc-23e3-461d-9836-3fcf5759ff0f" 00:15:21.898 ], 00:15:21.898 "product_name": "Raid Volume", 00:15:21.898 "block_size": 512, 00:15:21.898 "num_blocks": 262144, 00:15:21.898 "uuid": "1a85c1cc-23e3-461d-9836-3fcf5759ff0f", 00:15:21.898 "assigned_rate_limits": { 00:15:21.898 "rw_ios_per_sec": 0, 00:15:21.898 "rw_mbytes_per_sec": 0, 00:15:21.898 "r_mbytes_per_sec": 0, 00:15:21.898 "w_mbytes_per_sec": 0 00:15:21.898 }, 00:15:21.899 "claimed": false, 00:15:21.899 "zoned": false, 00:15:21.899 "supported_io_types": { 00:15:21.899 "read": true, 00:15:21.899 "write": true, 00:15:21.899 "unmap": true, 00:15:21.899 "write_zeroes": true, 00:15:21.899 "flush": true, 00:15:21.899 "reset": true, 00:15:21.899 "compare": false, 00:15:21.899 "compare_and_write": false, 00:15:21.899 "abort": false, 00:15:21.899 "nvme_admin": false, 00:15:21.899 "nvme_io": false 00:15:21.899 }, 00:15:21.899 "memory_domains": [ 00:15:21.899 { 00:15:21.899 "dma_device_id": "system", 00:15:21.899 "dma_device_type": 1 00:15:21.899 }, 00:15:21.899 { 00:15:21.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.899 "dma_device_type": 2 00:15:21.899 }, 00:15:21.899 { 00:15:21.899 "dma_device_id": "system", 00:15:21.899 "dma_device_type": 1 00:15:21.899 }, 00:15:21.899 { 00:15:21.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.899 "dma_device_type": 2 00:15:21.899 }, 00:15:21.899 { 00:15:21.899 "dma_device_id": "system", 00:15:21.899 "dma_device_type": 1 00:15:21.899 }, 00:15:21.899 { 00:15:21.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.899 "dma_device_type": 2 00:15:21.899 }, 00:15:21.899 { 00:15:21.899 "dma_device_id": "system", 00:15:21.899 "dma_device_type": 1 00:15:21.899 }, 00:15:21.899 { 00:15:21.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.899 "dma_device_type": 2 00:15:21.899 } 00:15:21.899 ], 00:15:21.899 "driver_specific": { 00:15:21.899 "raid": { 00:15:21.899 "uuid": "1a85c1cc-23e3-461d-9836-3fcf5759ff0f", 00:15:21.899 "strip_size_kb": 64, 00:15:21.899 "state": "online", 00:15:21.899 "raid_level": "concat", 00:15:21.899 "superblock": false, 00:15:21.899 "num_base_bdevs": 4, 00:15:21.899 "num_base_bdevs_discovered": 4, 00:15:21.899 "num_base_bdevs_operational": 4, 00:15:21.899 "base_bdevs_list": [ 00:15:21.899 { 00:15:21.899 "name": "BaseBdev1", 00:15:21.899 "uuid": "6b68f5c9-ec66-4816-9e08-d0b736e1a8ef", 00:15:21.899 "is_configured": true, 00:15:21.899 "data_offset": 0, 00:15:21.899 "data_size": 65536 00:15:21.899 }, 00:15:21.899 { 00:15:21.899 "name": "BaseBdev2", 00:15:21.899 "uuid": "30ad46ee-cded-49bf-bbb2-8309dee82223", 00:15:21.899 "is_configured": true, 00:15:21.899 "data_offset": 0, 00:15:21.899 "data_size": 65536 00:15:21.899 }, 00:15:21.899 { 00:15:21.899 "name": "BaseBdev3", 00:15:21.899 "uuid": "75e6d1be-292d-4711-9911-6b58fa46591e", 00:15:21.899 "is_configured": true, 00:15:21.899 "data_offset": 0, 00:15:21.899 "data_size": 65536 00:15:21.899 }, 00:15:21.899 { 00:15:21.899 "name": "BaseBdev4", 00:15:21.899 "uuid": "31b836b4-67db-4698-99c9-f87e0768f29e", 00:15:21.899 "is_configured": true, 00:15:21.899 "data_offset": 0, 00:15:21.899 "data_size": 65536 00:15:21.899 } 00:15:21.899 ] 00:15:21.899 } 00:15:21.899 } 00:15:21.899 }' 00:15:21.899 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:21.899 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:15:21.899 BaseBdev2 00:15:21.899 BaseBdev3 00:15:21.899 BaseBdev4' 
00:15:21.899 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:21.899 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:21.899 04:16:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:22.157 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:22.157 "name": "BaseBdev1", 00:15:22.157 "aliases": [ 00:15:22.157 "6b68f5c9-ec66-4816-9e08-d0b736e1a8ef" 00:15:22.157 ], 00:15:22.157 "product_name": "Malloc disk", 00:15:22.157 "block_size": 512, 00:15:22.157 "num_blocks": 65536, 00:15:22.157 "uuid": "6b68f5c9-ec66-4816-9e08-d0b736e1a8ef", 00:15:22.157 "assigned_rate_limits": { 00:15:22.157 "rw_ios_per_sec": 0, 00:15:22.157 "rw_mbytes_per_sec": 0, 00:15:22.157 "r_mbytes_per_sec": 0, 00:15:22.157 "w_mbytes_per_sec": 0 00:15:22.157 }, 00:15:22.157 "claimed": true, 00:15:22.157 "claim_type": "exclusive_write", 00:15:22.157 "zoned": false, 00:15:22.157 "supported_io_types": { 00:15:22.157 "read": true, 00:15:22.157 "write": true, 00:15:22.157 "unmap": true, 00:15:22.157 "write_zeroes": true, 00:15:22.157 "flush": true, 00:15:22.157 "reset": true, 00:15:22.157 "compare": false, 00:15:22.157 "compare_and_write": false, 00:15:22.157 "abort": true, 00:15:22.157 "nvme_admin": false, 00:15:22.157 "nvme_io": false 00:15:22.157 }, 00:15:22.157 "memory_domains": [ 00:15:22.157 { 00:15:22.157 "dma_device_id": "system", 00:15:22.157 "dma_device_type": 1 00:15:22.157 }, 00:15:22.157 { 00:15:22.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.157 "dma_device_type": 2 00:15:22.157 } 00:15:22.157 ], 00:15:22.157 "driver_specific": {} 00:15:22.157 }' 00:15:22.157 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:22.157 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:22.415 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:22.673 04:16:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:22.673 "name": "BaseBdev2", 00:15:22.673 "aliases": [ 00:15:22.673 "30ad46ee-cded-49bf-bbb2-8309dee82223" 00:15:22.673 ], 00:15:22.673 "product_name": "Malloc disk", 00:15:22.673 "block_size": 512, 00:15:22.673 "num_blocks": 65536, 00:15:22.673 "uuid": "30ad46ee-cded-49bf-bbb2-8309dee82223", 00:15:22.673 "assigned_rate_limits": { 00:15:22.673 "rw_ios_per_sec": 0, 00:15:22.673 "rw_mbytes_per_sec": 0, 00:15:22.673 "r_mbytes_per_sec": 0, 00:15:22.673 "w_mbytes_per_sec": 0 00:15:22.673 }, 00:15:22.673 "claimed": true, 00:15:22.673 "claim_type": "exclusive_write", 00:15:22.673 "zoned": false, 00:15:22.673 "supported_io_types": { 00:15:22.673 "read": true, 00:15:22.673 "write": true, 00:15:22.673 "unmap": true, 00:15:22.673 "write_zeroes": true, 00:15:22.673 "flush": true, 00:15:22.673 "reset": true, 00:15:22.673 "compare": false, 00:15:22.673 "compare_and_write": false, 00:15:22.673 "abort": true, 00:15:22.673 "nvme_admin": false, 00:15:22.673 "nvme_io": false 00:15:22.673 }, 00:15:22.673 "memory_domains": [ 00:15:22.673 { 00:15:22.673 "dma_device_id": "system", 00:15:22.673 "dma_device_type": 1 00:15:22.673 }, 00:15:22.673 { 00:15:22.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.673 "dma_device_type": 2 00:15:22.673 } 00:15:22.673 ], 00:15:22.673 "driver_specific": {} 00:15:22.673 }' 00:15:22.673 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:22.673 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:22.931 04:16:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:23.189 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:23.189 "name": "BaseBdev3", 00:15:23.189 "aliases": [ 00:15:23.189 "75e6d1be-292d-4711-9911-6b58fa46591e" 00:15:23.189 ], 00:15:23.189 "product_name": "Malloc disk", 00:15:23.189 "block_size": 512, 00:15:23.189 "num_blocks": 65536, 00:15:23.189 "uuid": "75e6d1be-292d-4711-9911-6b58fa46591e", 00:15:23.189 "assigned_rate_limits": { 00:15:23.189 "rw_ios_per_sec": 0, 00:15:23.189 "rw_mbytes_per_sec": 0, 00:15:23.189 
"r_mbytes_per_sec": 0, 00:15:23.189 "w_mbytes_per_sec": 0 00:15:23.189 }, 00:15:23.189 "claimed": true, 00:15:23.189 "claim_type": "exclusive_write", 00:15:23.189 "zoned": false, 00:15:23.189 "supported_io_types": { 00:15:23.189 "read": true, 00:15:23.189 "write": true, 00:15:23.189 "unmap": true, 00:15:23.189 "write_zeroes": true, 00:15:23.189 "flush": true, 00:15:23.189 "reset": true, 00:15:23.189 "compare": false, 00:15:23.189 "compare_and_write": false, 00:15:23.189 "abort": true, 00:15:23.189 "nvme_admin": false, 00:15:23.189 "nvme_io": false 00:15:23.189 }, 00:15:23.189 "memory_domains": [ 00:15:23.189 { 00:15:23.189 "dma_device_id": "system", 00:15:23.189 "dma_device_type": 1 00:15:23.189 }, 00:15:23.189 { 00:15:23.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.189 "dma_device_type": 2 00:15:23.189 } 00:15:23.189 ], 00:15:23.189 "driver_specific": {} 00:15:23.189 }' 00:15:23.189 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:23.448 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:23.448 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:23.448 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:23.448 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:23.448 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:23.448 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:23.449 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:23.449 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:23.449 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:23.706 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:23.706 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:23.706 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:23.706 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:23.706 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:23.964 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:23.964 "name": "BaseBdev4", 00:15:23.964 "aliases": [ 00:15:23.964 "31b836b4-67db-4698-99c9-f87e0768f29e" 00:15:23.964 ], 00:15:23.964 "product_name": "Malloc disk", 00:15:23.964 "block_size": 512, 00:15:23.964 "num_blocks": 65536, 00:15:23.964 "uuid": "31b836b4-67db-4698-99c9-f87e0768f29e", 00:15:23.964 "assigned_rate_limits": { 00:15:23.964 "rw_ios_per_sec": 0, 00:15:23.964 "rw_mbytes_per_sec": 0, 00:15:23.964 "r_mbytes_per_sec": 0, 00:15:23.964 "w_mbytes_per_sec": 0 00:15:23.964 }, 00:15:23.964 "claimed": true, 00:15:23.964 "claim_type": "exclusive_write", 00:15:23.964 "zoned": false, 00:15:23.964 "supported_io_types": { 00:15:23.964 "read": true, 00:15:23.964 "write": true, 00:15:23.964 "unmap": true, 00:15:23.964 "write_zeroes": true, 00:15:23.964 "flush": true, 00:15:23.964 "reset": true, 00:15:23.964 "compare": false, 00:15:23.964 "compare_and_write": false, 00:15:23.964 
"abort": true, 00:15:23.964 "nvme_admin": false, 00:15:23.964 "nvme_io": false 00:15:23.964 }, 00:15:23.964 "memory_domains": [ 00:15:23.964 { 00:15:23.964 "dma_device_id": "system", 00:15:23.964 "dma_device_type": 1 00:15:23.964 }, 00:15:23.964 { 00:15:23.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.964 "dma_device_type": 2 00:15:23.964 } 00:15:23.964 ], 00:15:23.964 "driver_specific": {} 00:15:23.964 }' 00:15:23.964 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:23.964 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:23.964 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:23.964 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:23.964 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:23.964 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:23.964 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:23.964 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:24.220 04:16:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:24.220 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:24.220 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:24.220 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:24.220 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:24.478 [2024-05-15 04:16:12.295569] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:24.478 [2024-05-15 04:16:12.295597] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:24.478 [2024-05-15 04:16:12.295654] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy concat 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@216 -- # return 1 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local 
raid_bdev_info 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.478 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.736 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:24.736 "name": "Existed_Raid", 00:15:24.736 "uuid": "1a85c1cc-23e3-461d-9836-3fcf5759ff0f", 00:15:24.736 "strip_size_kb": 64, 00:15:24.736 "state": "offline", 00:15:24.736 "raid_level": "concat", 00:15:24.736 "superblock": false, 00:15:24.736 "num_base_bdevs": 4, 00:15:24.736 "num_base_bdevs_discovered": 3, 00:15:24.736 "num_base_bdevs_operational": 3, 00:15:24.736 "base_bdevs_list": [ 00:15:24.736 { 00:15:24.736 "name": null, 00:15:24.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:24.736 "is_configured": false, 00:15:24.736 "data_offset": 0, 00:15:24.736 "data_size": 65536 00:15:24.736 }, 00:15:24.736 { 00:15:24.736 "name": "BaseBdev2", 00:15:24.736 "uuid": "30ad46ee-cded-49bf-bbb2-8309dee82223", 00:15:24.736 "is_configured": true, 00:15:24.736 "data_offset": 0, 00:15:24.736 "data_size": 65536 00:15:24.736 }, 00:15:24.736 { 00:15:24.736 "name": "BaseBdev3", 00:15:24.736 "uuid": "75e6d1be-292d-4711-9911-6b58fa46591e", 00:15:24.736 "is_configured": true, 00:15:24.736 "data_offset": 0, 00:15:24.736 "data_size": 65536 00:15:24.736 }, 00:15:24.736 { 00:15:24.736 "name": "BaseBdev4", 00:15:24.736 "uuid": "31b836b4-67db-4698-99c9-f87e0768f29e", 00:15:24.736 "is_configured": true, 00:15:24.736 "data_offset": 0, 00:15:24.736 "data_size": 65536 00:15:24.736 } 00:15:24.736 ] 00:15:24.736 }' 00:15:24.736 04:16:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:24.737 04:16:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:25.302 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:15:25.302 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:25.302 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.302 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:25.559 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:25.559 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:25.559 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:25.817 [2024-05-15 04:16:13.629569] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:25.817 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:25.817 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # 
(( i < num_base_bdevs )) 00:15:25.817 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.817 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:26.075 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:26.075 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:26.075 04:16:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:26.333 [2024-05-15 04:16:14.172639] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:26.333 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:26.333 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:26.333 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.333 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:26.590 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:26.590 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:26.590 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:26.847 [2024-05-15 04:16:14.719897] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:26.847 [2024-05-15 04:16:14.719962] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23e77f0 name Existed_Raid, state offline 00:15:26.847 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:26.847 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:26.847 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.847 04:16:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:15:27.104 04:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:15:27.104 04:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:15:27.104 04:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:15:27.104 04:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:15:27.104 04:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:27.104 04:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:27.362 BaseBdev2 00:15:27.362 04:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:15:27.362 04:16:15 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:15:27.362 04:16:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:27.362 04:16:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:27.362 04:16:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:27.362 04:16:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:27.362 04:16:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:27.622 04:16:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:27.880 [ 00:15:27.880 { 00:15:27.880 "name": "BaseBdev2", 00:15:27.880 "aliases": [ 00:15:27.880 "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8" 00:15:27.880 ], 00:15:27.880 "product_name": "Malloc disk", 00:15:27.880 "block_size": 512, 00:15:27.880 "num_blocks": 65536, 00:15:27.880 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:27.880 "assigned_rate_limits": { 00:15:27.880 "rw_ios_per_sec": 0, 00:15:27.880 "rw_mbytes_per_sec": 0, 00:15:27.880 "r_mbytes_per_sec": 0, 00:15:27.880 "w_mbytes_per_sec": 0 00:15:27.880 }, 00:15:27.880 "claimed": false, 00:15:27.880 "zoned": false, 00:15:27.880 "supported_io_types": { 00:15:27.880 "read": true, 00:15:27.880 "write": true, 00:15:27.880 "unmap": true, 00:15:27.880 "write_zeroes": true, 00:15:27.880 "flush": true, 00:15:27.880 "reset": true, 00:15:27.880 "compare": false, 00:15:27.880 "compare_and_write": false, 00:15:27.880 "abort": true, 00:15:27.880 "nvme_admin": false, 00:15:27.880 "nvme_io": false 00:15:27.880 }, 00:15:27.880 "memory_domains": [ 00:15:27.880 { 00:15:27.880 "dma_device_id": "system", 00:15:27.880 "dma_device_type": 1 00:15:27.880 }, 00:15:27.880 { 00:15:27.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.880 "dma_device_type": 2 00:15:27.880 } 00:15:27.880 ], 00:15:27.880 "driver_specific": {} 00:15:27.880 } 00:15:27.880 ] 00:15:27.880 04:16:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:27.880 04:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:27.880 04:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:27.880 04:16:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:28.138 BaseBdev3 00:15:28.138 04:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:15:28.138 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:15:28.138 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:28.138 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:28.138 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:28.139 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:28.139 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:28.396 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:28.654 [ 00:15:28.654 { 00:15:28.654 "name": "BaseBdev3", 00:15:28.654 "aliases": [ 00:15:28.654 "ef9ade33-7b53-4e4e-b464-84d6b378b629" 00:15:28.654 ], 00:15:28.654 "product_name": "Malloc disk", 00:15:28.654 "block_size": 512, 00:15:28.654 "num_blocks": 65536, 00:15:28.654 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:28.654 "assigned_rate_limits": { 00:15:28.654 "rw_ios_per_sec": 0, 00:15:28.654 "rw_mbytes_per_sec": 0, 00:15:28.654 "r_mbytes_per_sec": 0, 00:15:28.654 "w_mbytes_per_sec": 0 00:15:28.654 }, 00:15:28.654 "claimed": false, 00:15:28.654 "zoned": false, 00:15:28.654 "supported_io_types": { 00:15:28.654 "read": true, 00:15:28.654 "write": true, 00:15:28.654 "unmap": true, 00:15:28.654 "write_zeroes": true, 00:15:28.654 "flush": true, 00:15:28.654 "reset": true, 00:15:28.654 "compare": false, 00:15:28.654 "compare_and_write": false, 00:15:28.654 "abort": true, 00:15:28.654 "nvme_admin": false, 00:15:28.654 "nvme_io": false 00:15:28.654 }, 00:15:28.654 "memory_domains": [ 00:15:28.654 { 00:15:28.654 "dma_device_id": "system", 00:15:28.654 "dma_device_type": 1 00:15:28.654 }, 00:15:28.654 { 00:15:28.654 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.654 "dma_device_type": 2 00:15:28.654 } 00:15:28.654 ], 00:15:28.654 "driver_specific": {} 00:15:28.654 } 00:15:28.654 ] 00:15:28.654 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:28.654 04:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:28.654 04:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:28.654 04:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:28.912 BaseBdev4 00:15:28.912 04:16:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:15:28.912 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:15:28.912 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:28.912 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:28.912 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:28.912 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:28.912 04:16:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:29.170 04:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:29.429 [ 00:15:29.429 { 00:15:29.429 "name": "BaseBdev4", 00:15:29.429 "aliases": [ 00:15:29.429 "1ada3b53-487c-46b4-a9a5-8b741ea6a927" 00:15:29.429 ], 00:15:29.429 "product_name": "Malloc disk", 00:15:29.429 "block_size": 512, 00:15:29.429 
"num_blocks": 65536, 00:15:29.429 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:29.429 "assigned_rate_limits": { 00:15:29.429 "rw_ios_per_sec": 0, 00:15:29.429 "rw_mbytes_per_sec": 0, 00:15:29.429 "r_mbytes_per_sec": 0, 00:15:29.429 "w_mbytes_per_sec": 0 00:15:29.429 }, 00:15:29.429 "claimed": false, 00:15:29.429 "zoned": false, 00:15:29.429 "supported_io_types": { 00:15:29.429 "read": true, 00:15:29.429 "write": true, 00:15:29.429 "unmap": true, 00:15:29.429 "write_zeroes": true, 00:15:29.429 "flush": true, 00:15:29.429 "reset": true, 00:15:29.429 "compare": false, 00:15:29.429 "compare_and_write": false, 00:15:29.429 "abort": true, 00:15:29.429 "nvme_admin": false, 00:15:29.429 "nvme_io": false 00:15:29.429 }, 00:15:29.429 "memory_domains": [ 00:15:29.429 { 00:15:29.429 "dma_device_id": "system", 00:15:29.429 "dma_device_type": 1 00:15:29.429 }, 00:15:29.429 { 00:15:29.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:29.429 "dma_device_type": 2 00:15:29.429 } 00:15:29.429 ], 00:15:29.429 "driver_specific": {} 00:15:29.429 } 00:15:29.429 ] 00:15:29.429 04:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:29.429 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:15:29.429 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:29.429 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:29.688 [2024-05-15 04:16:17.643481] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:29.688 [2024-05-15 04:16:17.643528] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:29.688 [2024-05-15 04:16:17.643565] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:29.688 [2024-05-15 04:16:17.644991] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:29.688 [2024-05-15 04:16:17.645041] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.688 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:29.946 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:29.946 "name": "Existed_Raid", 00:15:29.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:29.946 "strip_size_kb": 64, 00:15:29.946 "state": "configuring", 00:15:29.946 "raid_level": "concat", 00:15:29.946 "superblock": false, 00:15:29.946 "num_base_bdevs": 4, 00:15:29.946 "num_base_bdevs_discovered": 3, 00:15:29.946 "num_base_bdevs_operational": 4, 00:15:29.946 "base_bdevs_list": [ 00:15:29.946 { 00:15:29.946 "name": "BaseBdev1", 00:15:29.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:29.946 "is_configured": false, 00:15:29.946 "data_offset": 0, 00:15:29.946 "data_size": 0 00:15:29.946 }, 00:15:29.946 { 00:15:29.946 "name": "BaseBdev2", 00:15:29.946 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:29.946 "is_configured": true, 00:15:29.946 "data_offset": 0, 00:15:29.946 "data_size": 65536 00:15:29.946 }, 00:15:29.946 { 00:15:29.946 "name": "BaseBdev3", 00:15:29.946 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:29.946 "is_configured": true, 00:15:29.946 "data_offset": 0, 00:15:29.946 "data_size": 65536 00:15:29.946 }, 00:15:29.946 { 00:15:29.946 "name": "BaseBdev4", 00:15:29.946 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:29.946 "is_configured": true, 00:15:29.946 "data_offset": 0, 00:15:29.946 "data_size": 65536 00:15:29.946 } 00:15:29.946 ] 00:15:29.946 }' 00:15:29.946 04:16:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:29.946 04:16:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:30.879 [2024-05-15 04:16:18.814566] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:15:30.879 04:16:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.137 04:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:31.137 "name": "Existed_Raid", 00:15:31.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.137 "strip_size_kb": 64, 00:15:31.137 "state": "configuring", 00:15:31.137 "raid_level": "concat", 00:15:31.137 "superblock": false, 00:15:31.137 "num_base_bdevs": 4, 00:15:31.137 "num_base_bdevs_discovered": 2, 00:15:31.137 "num_base_bdevs_operational": 4, 00:15:31.137 "base_bdevs_list": [ 00:15:31.137 { 00:15:31.137 "name": "BaseBdev1", 00:15:31.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.137 "is_configured": false, 00:15:31.137 "data_offset": 0, 00:15:31.137 "data_size": 0 00:15:31.137 }, 00:15:31.137 { 00:15:31.137 "name": null, 00:15:31.137 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:31.137 "is_configured": false, 00:15:31.138 "data_offset": 0, 00:15:31.138 "data_size": 65536 00:15:31.138 }, 00:15:31.138 { 00:15:31.138 "name": "BaseBdev3", 00:15:31.138 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:31.138 "is_configured": true, 00:15:31.138 "data_offset": 0, 00:15:31.138 "data_size": 65536 00:15:31.138 }, 00:15:31.138 { 00:15:31.138 "name": "BaseBdev4", 00:15:31.138 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:31.138 "is_configured": true, 00:15:31.138 "data_offset": 0, 00:15:31.138 "data_size": 65536 00:15:31.138 } 00:15:31.138 ] 00:15:31.138 }' 00:15:31.138 04:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:31.138 04:16:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:31.704 04:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.704 04:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:31.962 04:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:15:31.962 04:16:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:32.220 [2024-05-15 04:16:20.144530] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:32.220 BaseBdev1 00:15:32.220 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:15:32.220 04:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:15:32.220 04:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:32.220 04:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:32.220 04:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:32.220 04:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:32.220 04:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:32.478 04:16:20 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:32.736 [ 00:15:32.736 { 00:15:32.736 "name": "BaseBdev1", 00:15:32.736 "aliases": [ 00:15:32.736 "191333d1-34e1-40b1-8dd4-ff2300c77d29" 00:15:32.736 ], 00:15:32.736 "product_name": "Malloc disk", 00:15:32.736 "block_size": 512, 00:15:32.736 "num_blocks": 65536, 00:15:32.736 "uuid": "191333d1-34e1-40b1-8dd4-ff2300c77d29", 00:15:32.736 "assigned_rate_limits": { 00:15:32.736 "rw_ios_per_sec": 0, 00:15:32.736 "rw_mbytes_per_sec": 0, 00:15:32.736 "r_mbytes_per_sec": 0, 00:15:32.736 "w_mbytes_per_sec": 0 00:15:32.736 }, 00:15:32.736 "claimed": true, 00:15:32.736 "claim_type": "exclusive_write", 00:15:32.736 "zoned": false, 00:15:32.736 "supported_io_types": { 00:15:32.736 "read": true, 00:15:32.736 "write": true, 00:15:32.736 "unmap": true, 00:15:32.736 "write_zeroes": true, 00:15:32.736 "flush": true, 00:15:32.736 "reset": true, 00:15:32.736 "compare": false, 00:15:32.736 "compare_and_write": false, 00:15:32.736 "abort": true, 00:15:32.736 "nvme_admin": false, 00:15:32.736 "nvme_io": false 00:15:32.736 }, 00:15:32.736 "memory_domains": [ 00:15:32.736 { 00:15:32.736 "dma_device_id": "system", 00:15:32.736 "dma_device_type": 1 00:15:32.736 }, 00:15:32.736 { 00:15:32.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.737 "dma_device_type": 2 00:15:32.737 } 00:15:32.737 ], 00:15:32.737 "driver_specific": {} 00:15:32.737 } 00:15:32.737 ] 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.737 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.995 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:32.995 "name": "Existed_Raid", 00:15:32.995 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.995 "strip_size_kb": 64, 00:15:32.995 "state": "configuring", 00:15:32.995 "raid_level": "concat", 00:15:32.995 "superblock": false, 00:15:32.995 "num_base_bdevs": 4, 00:15:32.995 "num_base_bdevs_discovered": 3, 
00:15:32.995 "num_base_bdevs_operational": 4, 00:15:32.995 "base_bdevs_list": [ 00:15:32.995 { 00:15:32.995 "name": "BaseBdev1", 00:15:32.995 "uuid": "191333d1-34e1-40b1-8dd4-ff2300c77d29", 00:15:32.995 "is_configured": true, 00:15:32.995 "data_offset": 0, 00:15:32.995 "data_size": 65536 00:15:32.995 }, 00:15:32.995 { 00:15:32.995 "name": null, 00:15:32.995 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:32.995 "is_configured": false, 00:15:32.995 "data_offset": 0, 00:15:32.995 "data_size": 65536 00:15:32.995 }, 00:15:32.995 { 00:15:32.995 "name": "BaseBdev3", 00:15:32.995 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:32.995 "is_configured": true, 00:15:32.995 "data_offset": 0, 00:15:32.995 "data_size": 65536 00:15:32.995 }, 00:15:32.995 { 00:15:32.995 "name": "BaseBdev4", 00:15:32.995 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:32.995 "is_configured": true, 00:15:32.995 "data_offset": 0, 00:15:32.995 "data_size": 65536 00:15:32.995 } 00:15:32.995 ] 00:15:32.995 }' 00:15:32.995 04:16:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:32.995 04:16:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.559 04:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.559 04:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:33.818 04:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:15:33.818 04:16:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:34.077 [2024-05-15 04:16:22.053661] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.077 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.335 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
raid_bdev_info='{ 00:15:34.335 "name": "Existed_Raid", 00:15:34.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.335 "strip_size_kb": 64, 00:15:34.335 "state": "configuring", 00:15:34.335 "raid_level": "concat", 00:15:34.335 "superblock": false, 00:15:34.335 "num_base_bdevs": 4, 00:15:34.335 "num_base_bdevs_discovered": 2, 00:15:34.335 "num_base_bdevs_operational": 4, 00:15:34.335 "base_bdevs_list": [ 00:15:34.335 { 00:15:34.335 "name": "BaseBdev1", 00:15:34.335 "uuid": "191333d1-34e1-40b1-8dd4-ff2300c77d29", 00:15:34.335 "is_configured": true, 00:15:34.335 "data_offset": 0, 00:15:34.335 "data_size": 65536 00:15:34.335 }, 00:15:34.335 { 00:15:34.335 "name": null, 00:15:34.335 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:34.335 "is_configured": false, 00:15:34.335 "data_offset": 0, 00:15:34.335 "data_size": 65536 00:15:34.335 }, 00:15:34.335 { 00:15:34.335 "name": null, 00:15:34.335 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:34.335 "is_configured": false, 00:15:34.335 "data_offset": 0, 00:15:34.335 "data_size": 65536 00:15:34.335 }, 00:15:34.335 { 00:15:34.335 "name": "BaseBdev4", 00:15:34.335 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:34.335 "is_configured": true, 00:15:34.335 "data_offset": 0, 00:15:34.335 "data_size": 65536 00:15:34.335 } 00:15:34.335 ] 00:15:34.335 }' 00:15:34.335 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:34.335 04:16:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:34.934 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.934 04:16:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:35.217 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:15:35.217 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:35.476 [2024-05-15 04:16:23.317072] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.476 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:35.735 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:35.735 "name": "Existed_Raid", 00:15:35.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:35.735 "strip_size_kb": 64, 00:15:35.735 "state": "configuring", 00:15:35.735 "raid_level": "concat", 00:15:35.735 "superblock": false, 00:15:35.735 "num_base_bdevs": 4, 00:15:35.735 "num_base_bdevs_discovered": 3, 00:15:35.735 "num_base_bdevs_operational": 4, 00:15:35.735 "base_bdevs_list": [ 00:15:35.735 { 00:15:35.735 "name": "BaseBdev1", 00:15:35.735 "uuid": "191333d1-34e1-40b1-8dd4-ff2300c77d29", 00:15:35.735 "is_configured": true, 00:15:35.735 "data_offset": 0, 00:15:35.735 "data_size": 65536 00:15:35.735 }, 00:15:35.735 { 00:15:35.735 "name": null, 00:15:35.735 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:35.735 "is_configured": false, 00:15:35.735 "data_offset": 0, 00:15:35.735 "data_size": 65536 00:15:35.735 }, 00:15:35.735 { 00:15:35.735 "name": "BaseBdev3", 00:15:35.735 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:35.735 "is_configured": true, 00:15:35.735 "data_offset": 0, 00:15:35.735 "data_size": 65536 00:15:35.735 }, 00:15:35.735 { 00:15:35.735 "name": "BaseBdev4", 00:15:35.735 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:35.735 "is_configured": true, 00:15:35.735 "data_offset": 0, 00:15:35.735 "data_size": 65536 00:15:35.735 } 00:15:35.735 ] 00:15:35.735 }' 00:15:35.735 04:16:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:35.735 04:16:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.300 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.300 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:36.558 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:15:36.558 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:36.816 [2024-05-15 04:16:24.708703] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.816 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:37.074 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:37.074 "name": "Existed_Raid", 00:15:37.074 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:37.074 "strip_size_kb": 64, 00:15:37.074 "state": "configuring", 00:15:37.074 "raid_level": "concat", 00:15:37.074 "superblock": false, 00:15:37.074 "num_base_bdevs": 4, 00:15:37.074 "num_base_bdevs_discovered": 2, 00:15:37.074 "num_base_bdevs_operational": 4, 00:15:37.074 "base_bdevs_list": [ 00:15:37.074 { 00:15:37.074 "name": null, 00:15:37.074 "uuid": "191333d1-34e1-40b1-8dd4-ff2300c77d29", 00:15:37.074 "is_configured": false, 00:15:37.074 "data_offset": 0, 00:15:37.074 "data_size": 65536 00:15:37.074 }, 00:15:37.074 { 00:15:37.074 "name": null, 00:15:37.074 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:37.074 "is_configured": false, 00:15:37.074 "data_offset": 0, 00:15:37.074 "data_size": 65536 00:15:37.074 }, 00:15:37.074 { 00:15:37.074 "name": "BaseBdev3", 00:15:37.074 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:37.074 "is_configured": true, 00:15:37.074 "data_offset": 0, 00:15:37.074 "data_size": 65536 00:15:37.074 }, 00:15:37.074 { 00:15:37.074 "name": "BaseBdev4", 00:15:37.074 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:37.074 "is_configured": true, 00:15:37.074 "data_offset": 0, 00:15:37.074 "data_size": 65536 00:15:37.074 } 00:15:37.074 ] 00:15:37.074 }' 00:15:37.074 04:16:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:37.074 04:16:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.638 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.638 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:37.896 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:15:37.896 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:38.153 [2024-05-15 04:16:25.940882] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:38.153 04:16:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.153 04:16:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:38.410 04:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:38.410 "name": "Existed_Raid", 00:15:38.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:38.410 "strip_size_kb": 64, 00:15:38.410 "state": "configuring", 00:15:38.410 "raid_level": "concat", 00:15:38.410 "superblock": false, 00:15:38.410 "num_base_bdevs": 4, 00:15:38.410 "num_base_bdevs_discovered": 3, 00:15:38.410 "num_base_bdevs_operational": 4, 00:15:38.410 "base_bdevs_list": [ 00:15:38.410 { 00:15:38.410 "name": null, 00:15:38.410 "uuid": "191333d1-34e1-40b1-8dd4-ff2300c77d29", 00:15:38.410 "is_configured": false, 00:15:38.410 "data_offset": 0, 00:15:38.410 "data_size": 65536 00:15:38.410 }, 00:15:38.410 { 00:15:38.410 "name": "BaseBdev2", 00:15:38.410 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:38.410 "is_configured": true, 00:15:38.410 "data_offset": 0, 00:15:38.410 "data_size": 65536 00:15:38.410 }, 00:15:38.410 { 00:15:38.410 "name": "BaseBdev3", 00:15:38.410 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:38.410 "is_configured": true, 00:15:38.410 "data_offset": 0, 00:15:38.410 "data_size": 65536 00:15:38.410 }, 00:15:38.410 { 00:15:38.410 "name": "BaseBdev4", 00:15:38.410 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:38.410 "is_configured": true, 00:15:38.410 "data_offset": 0, 00:15:38.410 "data_size": 65536 00:15:38.410 } 00:15:38.410 ] 00:15:38.410 }' 00:15:38.410 04:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:38.410 04:16:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:39.013 04:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.013 04:16:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:39.013 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:15:39.013 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.013 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:39.270 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b NewBaseBdev -u 191333d1-34e1-40b1-8dd4-ff2300c77d29 00:15:39.526 [2024-05-15 04:16:27.496165] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:39.527 [2024-05-15 04:16:27.496221] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x258d340 00:15:39.527 [2024-05-15 04:16:27.496230] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:15:39.527 [2024-05-15 04:16:27.496421] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x258f150 00:15:39.527 [2024-05-15 04:16:27.496559] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x258d340 00:15:39.527 [2024-05-15 04:16:27.496572] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x258d340 00:15:39.527 [2024-05-15 04:16:27.496778] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:39.527 NewBaseBdev 00:15:39.527 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:15:39.527 04:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:15:39.527 04:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:39.527 04:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:15:39.527 04:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:39.527 04:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:39.527 04:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:39.784 04:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:40.041 [ 00:15:40.041 { 00:15:40.041 "name": "NewBaseBdev", 00:15:40.041 "aliases": [ 00:15:40.041 "191333d1-34e1-40b1-8dd4-ff2300c77d29" 00:15:40.041 ], 00:15:40.041 "product_name": "Malloc disk", 00:15:40.041 "block_size": 512, 00:15:40.041 "num_blocks": 65536, 00:15:40.041 "uuid": "191333d1-34e1-40b1-8dd4-ff2300c77d29", 00:15:40.041 "assigned_rate_limits": { 00:15:40.041 "rw_ios_per_sec": 0, 00:15:40.041 "rw_mbytes_per_sec": 0, 00:15:40.041 "r_mbytes_per_sec": 0, 00:15:40.041 "w_mbytes_per_sec": 0 00:15:40.041 }, 00:15:40.041 "claimed": true, 00:15:40.041 "claim_type": "exclusive_write", 00:15:40.041 "zoned": false, 00:15:40.041 "supported_io_types": { 00:15:40.041 "read": true, 00:15:40.041 "write": true, 00:15:40.041 "unmap": true, 00:15:40.041 "write_zeroes": true, 00:15:40.041 "flush": true, 00:15:40.041 "reset": true, 00:15:40.041 "compare": false, 00:15:40.041 "compare_and_write": false, 00:15:40.041 "abort": true, 00:15:40.041 "nvme_admin": false, 00:15:40.041 "nvme_io": false 00:15:40.041 }, 00:15:40.041 "memory_domains": [ 00:15:40.041 { 00:15:40.041 "dma_device_id": "system", 00:15:40.041 "dma_device_type": 1 00:15:40.041 }, 00:15:40.041 { 00:15:40.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.041 "dma_device_type": 2 00:15:40.041 } 00:15:40.041 ], 00:15:40.041 "driver_specific": {} 00:15:40.042 } 00:15:40.042 ] 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
return 0 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.042 04:16:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:40.299 04:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:40.299 "name": "Existed_Raid", 00:15:40.299 "uuid": "28859f4d-4b15-4b34-b7c2-996c9b117333", 00:15:40.299 "strip_size_kb": 64, 00:15:40.299 "state": "online", 00:15:40.299 "raid_level": "concat", 00:15:40.299 "superblock": false, 00:15:40.299 "num_base_bdevs": 4, 00:15:40.299 "num_base_bdevs_discovered": 4, 00:15:40.299 "num_base_bdevs_operational": 4, 00:15:40.299 "base_bdevs_list": [ 00:15:40.299 { 00:15:40.299 "name": "NewBaseBdev", 00:15:40.299 "uuid": "191333d1-34e1-40b1-8dd4-ff2300c77d29", 00:15:40.299 "is_configured": true, 00:15:40.299 "data_offset": 0, 00:15:40.299 "data_size": 65536 00:15:40.299 }, 00:15:40.299 { 00:15:40.299 "name": "BaseBdev2", 00:15:40.299 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:40.299 "is_configured": true, 00:15:40.299 "data_offset": 0, 00:15:40.299 "data_size": 65536 00:15:40.299 }, 00:15:40.299 { 00:15:40.299 "name": "BaseBdev3", 00:15:40.299 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:40.299 "is_configured": true, 00:15:40.299 "data_offset": 0, 00:15:40.299 "data_size": 65536 00:15:40.299 }, 00:15:40.299 { 00:15:40.299 "name": "BaseBdev4", 00:15:40.299 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:40.299 "is_configured": true, 00:15:40.299 "data_offset": 0, 00:15:40.299 "data_size": 65536 00:15:40.299 } 00:15:40.299 ] 00:15:40.299 }' 00:15:40.299 04:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:40.299 04:16:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.863 04:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:15:40.863 04:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:15:40.863 04:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:40.863 04:16:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:40.863 04:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:40.863 04:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:15:40.863 04:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:40.863 04:16:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:41.120 [2024-05-15 04:16:29.012394] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:41.120 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:41.120 "name": "Existed_Raid", 00:15:41.120 "aliases": [ 00:15:41.120 "28859f4d-4b15-4b34-b7c2-996c9b117333" 00:15:41.120 ], 00:15:41.120 "product_name": "Raid Volume", 00:15:41.120 "block_size": 512, 00:15:41.120 "num_blocks": 262144, 00:15:41.120 "uuid": "28859f4d-4b15-4b34-b7c2-996c9b117333", 00:15:41.120 "assigned_rate_limits": { 00:15:41.120 "rw_ios_per_sec": 0, 00:15:41.120 "rw_mbytes_per_sec": 0, 00:15:41.120 "r_mbytes_per_sec": 0, 00:15:41.120 "w_mbytes_per_sec": 0 00:15:41.120 }, 00:15:41.120 "claimed": false, 00:15:41.120 "zoned": false, 00:15:41.120 "supported_io_types": { 00:15:41.120 "read": true, 00:15:41.120 "write": true, 00:15:41.120 "unmap": true, 00:15:41.120 "write_zeroes": true, 00:15:41.120 "flush": true, 00:15:41.120 "reset": true, 00:15:41.120 "compare": false, 00:15:41.120 "compare_and_write": false, 00:15:41.120 "abort": false, 00:15:41.120 "nvme_admin": false, 00:15:41.120 "nvme_io": false 00:15:41.120 }, 00:15:41.120 "memory_domains": [ 00:15:41.120 { 00:15:41.120 "dma_device_id": "system", 00:15:41.120 "dma_device_type": 1 00:15:41.120 }, 00:15:41.120 { 00:15:41.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.120 "dma_device_type": 2 00:15:41.120 }, 00:15:41.120 { 00:15:41.120 "dma_device_id": "system", 00:15:41.120 "dma_device_type": 1 00:15:41.120 }, 00:15:41.120 { 00:15:41.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.120 "dma_device_type": 2 00:15:41.120 }, 00:15:41.120 { 00:15:41.120 "dma_device_id": "system", 00:15:41.120 "dma_device_type": 1 00:15:41.120 }, 00:15:41.120 { 00:15:41.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.120 "dma_device_type": 2 00:15:41.120 }, 00:15:41.120 { 00:15:41.120 "dma_device_id": "system", 00:15:41.120 "dma_device_type": 1 00:15:41.120 }, 00:15:41.120 { 00:15:41.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.120 "dma_device_type": 2 00:15:41.120 } 00:15:41.120 ], 00:15:41.120 "driver_specific": { 00:15:41.120 "raid": { 00:15:41.120 "uuid": "28859f4d-4b15-4b34-b7c2-996c9b117333", 00:15:41.120 "strip_size_kb": 64, 00:15:41.120 "state": "online", 00:15:41.120 "raid_level": "concat", 00:15:41.120 "superblock": false, 00:15:41.120 "num_base_bdevs": 4, 00:15:41.120 "num_base_bdevs_discovered": 4, 00:15:41.120 "num_base_bdevs_operational": 4, 00:15:41.120 "base_bdevs_list": [ 00:15:41.120 { 00:15:41.120 "name": "NewBaseBdev", 00:15:41.120 "uuid": "191333d1-34e1-40b1-8dd4-ff2300c77d29", 00:15:41.120 "is_configured": true, 00:15:41.120 "data_offset": 0, 00:15:41.120 "data_size": 65536 00:15:41.120 }, 00:15:41.120 { 00:15:41.120 "name": "BaseBdev2", 00:15:41.120 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:41.120 "is_configured": true, 00:15:41.120 "data_offset": 0, 00:15:41.120 "data_size": 65536 00:15:41.120 }, 
00:15:41.120 { 00:15:41.120 "name": "BaseBdev3", 00:15:41.120 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:41.120 "is_configured": true, 00:15:41.120 "data_offset": 0, 00:15:41.120 "data_size": 65536 00:15:41.120 }, 00:15:41.120 { 00:15:41.120 "name": "BaseBdev4", 00:15:41.120 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:41.120 "is_configured": true, 00:15:41.120 "data_offset": 0, 00:15:41.120 "data_size": 65536 00:15:41.120 } 00:15:41.120 ] 00:15:41.120 } 00:15:41.120 } 00:15:41.120 }' 00:15:41.120 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:41.120 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:15:41.120 BaseBdev2 00:15:41.120 BaseBdev3 00:15:41.120 BaseBdev4' 00:15:41.120 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:41.120 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:41.120 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:41.379 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:41.379 "name": "NewBaseBdev", 00:15:41.379 "aliases": [ 00:15:41.379 "191333d1-34e1-40b1-8dd4-ff2300c77d29" 00:15:41.379 ], 00:15:41.379 "product_name": "Malloc disk", 00:15:41.379 "block_size": 512, 00:15:41.379 "num_blocks": 65536, 00:15:41.379 "uuid": "191333d1-34e1-40b1-8dd4-ff2300c77d29", 00:15:41.379 "assigned_rate_limits": { 00:15:41.379 "rw_ios_per_sec": 0, 00:15:41.379 "rw_mbytes_per_sec": 0, 00:15:41.379 "r_mbytes_per_sec": 0, 00:15:41.379 "w_mbytes_per_sec": 0 00:15:41.379 }, 00:15:41.379 "claimed": true, 00:15:41.379 "claim_type": "exclusive_write", 00:15:41.379 "zoned": false, 00:15:41.379 "supported_io_types": { 00:15:41.379 "read": true, 00:15:41.379 "write": true, 00:15:41.379 "unmap": true, 00:15:41.379 "write_zeroes": true, 00:15:41.379 "flush": true, 00:15:41.379 "reset": true, 00:15:41.379 "compare": false, 00:15:41.379 "compare_and_write": false, 00:15:41.379 "abort": true, 00:15:41.379 "nvme_admin": false, 00:15:41.379 "nvme_io": false 00:15:41.379 }, 00:15:41.379 "memory_domains": [ 00:15:41.379 { 00:15:41.379 "dma_device_id": "system", 00:15:41.379 "dma_device_type": 1 00:15:41.379 }, 00:15:41.379 { 00:15:41.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.379 "dma_device_type": 2 00:15:41.379 } 00:15:41.379 ], 00:15:41.379 "driver_specific": {} 00:15:41.379 }' 00:15:41.379 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:41.379 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:41.379 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:41.379 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:41.379 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:41.637 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:41.637 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:41.637 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:41.637 04:16:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:41.637 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:41.637 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:41.637 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:41.637 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:41.637 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:41.637 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:41.895 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:41.895 "name": "BaseBdev2", 00:15:41.895 "aliases": [ 00:15:41.895 "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8" 00:15:41.895 ], 00:15:41.895 "product_name": "Malloc disk", 00:15:41.895 "block_size": 512, 00:15:41.895 "num_blocks": 65536, 00:15:41.895 "uuid": "7bb737d0-22d3-4cd2-a78f-4f43c5590ba8", 00:15:41.895 "assigned_rate_limits": { 00:15:41.895 "rw_ios_per_sec": 0, 00:15:41.895 "rw_mbytes_per_sec": 0, 00:15:41.895 "r_mbytes_per_sec": 0, 00:15:41.895 "w_mbytes_per_sec": 0 00:15:41.895 }, 00:15:41.895 "claimed": true, 00:15:41.895 "claim_type": "exclusive_write", 00:15:41.895 "zoned": false, 00:15:41.895 "supported_io_types": { 00:15:41.895 "read": true, 00:15:41.895 "write": true, 00:15:41.895 "unmap": true, 00:15:41.895 "write_zeroes": true, 00:15:41.895 "flush": true, 00:15:41.895 "reset": true, 00:15:41.895 "compare": false, 00:15:41.895 "compare_and_write": false, 00:15:41.895 "abort": true, 00:15:41.895 "nvme_admin": false, 00:15:41.895 "nvme_io": false 00:15:41.895 }, 00:15:41.895 "memory_domains": [ 00:15:41.895 { 00:15:41.895 "dma_device_id": "system", 00:15:41.895 "dma_device_type": 1 00:15:41.895 }, 00:15:41.895 { 00:15:41.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.895 "dma_device_type": 2 00:15:41.895 } 00:15:41.895 ], 00:15:41.895 "driver_specific": {} 00:15:41.895 }' 00:15:41.895 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:41.895 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:41.895 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:41.895 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:42.153 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:42.153 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:42.153 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:42.153 04:16:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:42.153 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.153 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:42.153 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:42.153 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:42.153 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for 
name in $base_bdev_names 00:15:42.153 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:42.153 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:42.411 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:42.411 "name": "BaseBdev3", 00:15:42.411 "aliases": [ 00:15:42.411 "ef9ade33-7b53-4e4e-b464-84d6b378b629" 00:15:42.411 ], 00:15:42.411 "product_name": "Malloc disk", 00:15:42.411 "block_size": 512, 00:15:42.411 "num_blocks": 65536, 00:15:42.411 "uuid": "ef9ade33-7b53-4e4e-b464-84d6b378b629", 00:15:42.411 "assigned_rate_limits": { 00:15:42.411 "rw_ios_per_sec": 0, 00:15:42.411 "rw_mbytes_per_sec": 0, 00:15:42.411 "r_mbytes_per_sec": 0, 00:15:42.411 "w_mbytes_per_sec": 0 00:15:42.411 }, 00:15:42.411 "claimed": true, 00:15:42.411 "claim_type": "exclusive_write", 00:15:42.411 "zoned": false, 00:15:42.411 "supported_io_types": { 00:15:42.411 "read": true, 00:15:42.411 "write": true, 00:15:42.411 "unmap": true, 00:15:42.411 "write_zeroes": true, 00:15:42.411 "flush": true, 00:15:42.411 "reset": true, 00:15:42.411 "compare": false, 00:15:42.411 "compare_and_write": false, 00:15:42.411 "abort": true, 00:15:42.411 "nvme_admin": false, 00:15:42.411 "nvme_io": false 00:15:42.411 }, 00:15:42.411 "memory_domains": [ 00:15:42.411 { 00:15:42.411 "dma_device_id": "system", 00:15:42.411 "dma_device_type": 1 00:15:42.411 }, 00:15:42.411 { 00:15:42.411 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.411 "dma_device_type": 2 00:15:42.411 } 00:15:42.411 ], 00:15:42.411 "driver_specific": {} 00:15:42.411 }' 00:15:42.411 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:42.411 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:42.411 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:42.411 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:42.669 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:42.926 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:42.926 "name": "BaseBdev4", 00:15:42.926 
"aliases": [ 00:15:42.926 "1ada3b53-487c-46b4-a9a5-8b741ea6a927" 00:15:42.926 ], 00:15:42.926 "product_name": "Malloc disk", 00:15:42.926 "block_size": 512, 00:15:42.926 "num_blocks": 65536, 00:15:42.926 "uuid": "1ada3b53-487c-46b4-a9a5-8b741ea6a927", 00:15:42.926 "assigned_rate_limits": { 00:15:42.926 "rw_ios_per_sec": 0, 00:15:42.926 "rw_mbytes_per_sec": 0, 00:15:42.926 "r_mbytes_per_sec": 0, 00:15:42.926 "w_mbytes_per_sec": 0 00:15:42.926 }, 00:15:42.926 "claimed": true, 00:15:42.926 "claim_type": "exclusive_write", 00:15:42.926 "zoned": false, 00:15:42.926 "supported_io_types": { 00:15:42.926 "read": true, 00:15:42.926 "write": true, 00:15:42.926 "unmap": true, 00:15:42.926 "write_zeroes": true, 00:15:42.926 "flush": true, 00:15:42.926 "reset": true, 00:15:42.926 "compare": false, 00:15:42.926 "compare_and_write": false, 00:15:42.926 "abort": true, 00:15:42.926 "nvme_admin": false, 00:15:42.926 "nvme_io": false 00:15:42.926 }, 00:15:42.926 "memory_domains": [ 00:15:42.926 { 00:15:42.926 "dma_device_id": "system", 00:15:42.926 "dma_device_type": 1 00:15:42.926 }, 00:15:42.926 { 00:15:42.926 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.926 "dma_device_type": 2 00:15:42.926 } 00:15:42.926 ], 00:15:42.926 "driver_specific": {} 00:15:42.926 }' 00:15:42.926 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:42.927 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:42.927 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:42.927 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:43.184 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:43.184 04:16:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.184 04:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:43.184 04:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:43.184 04:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.184 04:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:43.184 04:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:43.184 04:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:43.184 04:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:43.442 [2024-05-15 04:16:31.418567] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:43.442 [2024-05-15 04:16:31.418602] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:43.442 [2024-05-15 04:16:31.418677] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:43.442 [2024-05-15 04:16:31.418743] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:43.442 [2024-05-15 04:16:31.418756] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x258d340 name Existed_Raid, state offline 00:15:43.442 04:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 3881231 00:15:43.442 04:16:31 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@946 -- # '[' -z 3881231 ']' 00:15:43.442 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 3881231 00:15:43.442 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:15:43.442 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:15:43.442 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3881231 00:15:43.700 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:15:43.700 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:15:43.700 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3881231' 00:15:43.700 killing process with pid 3881231 00:15:43.700 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 3881231 00:15:43.700 [2024-05-15 04:16:31.465466] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:43.700 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 3881231 00:15:43.700 [2024-05-15 04:16:31.549407] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:43.958 04:16:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:15:43.958 00:15:43.958 real 0m32.534s 00:15:43.958 user 1m0.581s 00:15:43.958 sys 0m4.368s 00:15:43.958 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:15:43.958 04:16:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.958 ************************************ 00:15:43.958 END TEST raid_state_function_test 00:15:43.958 ************************************ 00:15:44.217 04:16:31 bdev_raid -- bdev/bdev_raid.sh@804 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:15:44.217 04:16:31 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:15:44.217 04:16:31 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:15:44.217 04:16:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:44.217 ************************************ 00:15:44.217 START TEST raid_state_function_test_sb 00:15:44.217 ************************************ 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test concat 4 true 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=concat 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= 
num_base_bdevs )) 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' concat '!=' raid1 ']' 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size=64 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@233 -- # strip_size_create_arg='-z 64' 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=3885756 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3885756' 00:15:44.217 Process raid pid: 3885756 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 3885756 /var/tmp/spdk-raid.sock 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3885756 ']' 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:44.217 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:15:44.217 04:16:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:44.217 [2024-05-15 04:16:32.066169] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:15:44.217 [2024-05-15 04:16:32.066255] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:44.217 [2024-05-15 04:16:32.149878] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:44.476 [2024-05-15 04:16:32.271018] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:15:44.476 [2024-05-15 04:16:32.346946] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:44.476 [2024-05-15 04:16:32.346998] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:45.042 04:16:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:15:45.042 04:16:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:15:45.042 04:16:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:45.300 [2024-05-15 04:16:33.203887] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:45.300 [2024-05-15 04:16:33.203932] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:45.300 [2024-05-15 04:16:33.203945] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:45.300 [2024-05-15 04:16:33.203958] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:45.300 [2024-05-15 04:16:33.203967] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:45.300 [2024-05-15 04:16:33.203980] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:45.300 [2024-05-15 04:16:33.203996] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:45.300 [2024-05-15 04:16:33.204009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:45.300 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:45.300 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:45.300 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:45.300 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:45.300 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:45.300 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:45.300 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:45.300 
04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:45.300 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:45.300 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:45.300 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.301 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:45.558 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:45.558 "name": "Existed_Raid", 00:15:45.558 "uuid": "78650a50-a8dd-458e-913c-1d1368d8cc41", 00:15:45.558 "strip_size_kb": 64, 00:15:45.558 "state": "configuring", 00:15:45.558 "raid_level": "concat", 00:15:45.558 "superblock": true, 00:15:45.558 "num_base_bdevs": 4, 00:15:45.558 "num_base_bdevs_discovered": 0, 00:15:45.558 "num_base_bdevs_operational": 4, 00:15:45.558 "base_bdevs_list": [ 00:15:45.558 { 00:15:45.558 "name": "BaseBdev1", 00:15:45.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.558 "is_configured": false, 00:15:45.558 "data_offset": 0, 00:15:45.558 "data_size": 0 00:15:45.558 }, 00:15:45.558 { 00:15:45.558 "name": "BaseBdev2", 00:15:45.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.558 "is_configured": false, 00:15:45.559 "data_offset": 0, 00:15:45.559 "data_size": 0 00:15:45.559 }, 00:15:45.559 { 00:15:45.559 "name": "BaseBdev3", 00:15:45.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.559 "is_configured": false, 00:15:45.559 "data_offset": 0, 00:15:45.559 "data_size": 0 00:15:45.559 }, 00:15:45.559 { 00:15:45.559 "name": "BaseBdev4", 00:15:45.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.559 "is_configured": false, 00:15:45.559 "data_offset": 0, 00:15:45.559 "data_size": 0 00:15:45.559 } 00:15:45.559 ] 00:15:45.559 }' 00:15:45.559 04:16:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:45.559 04:16:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:46.123 04:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:46.381 [2024-05-15 04:16:34.278549] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:46.381 [2024-05-15 04:16:34.278587] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c03040 name Existed_Raid, state configuring 00:15:46.381 04:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:46.639 [2024-05-15 04:16:34.515212] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:46.639 [2024-05-15 04:16:34.515254] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:46.639 [2024-05-15 04:16:34.515273] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:46.639 [2024-05-15 04:16:34.515283] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:15:46.639 [2024-05-15 04:16:34.515291] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:46.639 [2024-05-15 04:16:34.515306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:46.639 [2024-05-15 04:16:34.515314] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:46.639 [2024-05-15 04:16:34.515324] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:46.639 04:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:46.897 [2024-05-15 04:16:34.772374] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:46.897 BaseBdev1 00:15:46.897 04:16:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:15:46.897 04:16:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:15:46.897 04:16:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:46.897 04:16:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:46.897 04:16:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:46.897 04:16:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:46.897 04:16:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:47.155 04:16:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:47.413 [ 00:15:47.413 { 00:15:47.413 "name": "BaseBdev1", 00:15:47.413 "aliases": [ 00:15:47.413 "3c2e10dd-1e9f-45dc-87b6-cba0ae0e1cb0" 00:15:47.413 ], 00:15:47.413 "product_name": "Malloc disk", 00:15:47.413 "block_size": 512, 00:15:47.413 "num_blocks": 65536, 00:15:47.413 "uuid": "3c2e10dd-1e9f-45dc-87b6-cba0ae0e1cb0", 00:15:47.413 "assigned_rate_limits": { 00:15:47.413 "rw_ios_per_sec": 0, 00:15:47.413 "rw_mbytes_per_sec": 0, 00:15:47.413 "r_mbytes_per_sec": 0, 00:15:47.413 "w_mbytes_per_sec": 0 00:15:47.413 }, 00:15:47.413 "claimed": true, 00:15:47.413 "claim_type": "exclusive_write", 00:15:47.413 "zoned": false, 00:15:47.413 "supported_io_types": { 00:15:47.413 "read": true, 00:15:47.413 "write": true, 00:15:47.413 "unmap": true, 00:15:47.413 "write_zeroes": true, 00:15:47.413 "flush": true, 00:15:47.413 "reset": true, 00:15:47.413 "compare": false, 00:15:47.413 "compare_and_write": false, 00:15:47.413 "abort": true, 00:15:47.413 "nvme_admin": false, 00:15:47.413 "nvme_io": false 00:15:47.413 }, 00:15:47.413 "memory_domains": [ 00:15:47.413 { 00:15:47.413 "dma_device_id": "system", 00:15:47.413 "dma_device_type": 1 00:15:47.413 }, 00:15:47.413 { 00:15:47.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:47.413 "dma_device_type": 2 00:15:47.413 } 00:15:47.413 ], 00:15:47.413 "driver_specific": {} 00:15:47.413 } 00:15:47.413 ] 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:47.413 04:16:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.413 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:47.671 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:47.671 "name": "Existed_Raid", 00:15:47.671 "uuid": "7cb60b96-06d5-4a21-bf47-23424ce1941c", 00:15:47.671 "strip_size_kb": 64, 00:15:47.671 "state": "configuring", 00:15:47.671 "raid_level": "concat", 00:15:47.671 "superblock": true, 00:15:47.671 "num_base_bdevs": 4, 00:15:47.671 "num_base_bdevs_discovered": 1, 00:15:47.671 "num_base_bdevs_operational": 4, 00:15:47.671 "base_bdevs_list": [ 00:15:47.671 { 00:15:47.671 "name": "BaseBdev1", 00:15:47.671 "uuid": "3c2e10dd-1e9f-45dc-87b6-cba0ae0e1cb0", 00:15:47.671 "is_configured": true, 00:15:47.671 "data_offset": 2048, 00:15:47.671 "data_size": 63488 00:15:47.671 }, 00:15:47.671 { 00:15:47.671 "name": "BaseBdev2", 00:15:47.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.671 "is_configured": false, 00:15:47.671 "data_offset": 0, 00:15:47.671 "data_size": 0 00:15:47.671 }, 00:15:47.671 { 00:15:47.671 "name": "BaseBdev3", 00:15:47.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.671 "is_configured": false, 00:15:47.671 "data_offset": 0, 00:15:47.671 "data_size": 0 00:15:47.671 }, 00:15:47.671 { 00:15:47.671 "name": "BaseBdev4", 00:15:47.671 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.671 "is_configured": false, 00:15:47.671 "data_offset": 0, 00:15:47.671 "data_size": 0 00:15:47.671 } 00:15:47.671 ] 00:15:47.671 }' 00:15:47.671 04:16:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:47.671 04:16:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:48.235 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:48.235 [2024-05-15 04:16:36.236171] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:48.235 [2024-05-15 04:16:36.236230] bdev_raid.c: 347:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1c028b0 name Existed_Raid, state configuring 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:15:48.493 [2024-05-15 04:16:36.468835] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:48.493 [2024-05-15 04:16:36.470065] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:48.493 [2024-05-15 04:16:36.470094] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:48.493 [2024-05-15 04:16:36.470114] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:48.493 [2024-05-15 04:16:36.470125] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:48.493 [2024-05-15 04:16:36.470132] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:15:48.493 [2024-05-15 04:16:36.470142] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.493 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.751 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:48.751 "name": "Existed_Raid", 00:15:48.751 "uuid": "5545a964-413d-4a8c-8cdf-93276d840821", 00:15:48.751 "strip_size_kb": 64, 00:15:48.751 "state": "configuring", 00:15:48.751 "raid_level": "concat", 00:15:48.751 "superblock": true, 00:15:48.751 "num_base_bdevs": 4, 00:15:48.751 "num_base_bdevs_discovered": 1, 00:15:48.751 "num_base_bdevs_operational": 4, 00:15:48.751 "base_bdevs_list": [ 00:15:48.751 { 00:15:48.751 "name": 
"BaseBdev1", 00:15:48.751 "uuid": "3c2e10dd-1e9f-45dc-87b6-cba0ae0e1cb0", 00:15:48.751 "is_configured": true, 00:15:48.751 "data_offset": 2048, 00:15:48.751 "data_size": 63488 00:15:48.751 }, 00:15:48.751 { 00:15:48.751 "name": "BaseBdev2", 00:15:48.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.751 "is_configured": false, 00:15:48.751 "data_offset": 0, 00:15:48.751 "data_size": 0 00:15:48.751 }, 00:15:48.751 { 00:15:48.751 "name": "BaseBdev3", 00:15:48.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.751 "is_configured": false, 00:15:48.751 "data_offset": 0, 00:15:48.751 "data_size": 0 00:15:48.751 }, 00:15:48.751 { 00:15:48.751 "name": "BaseBdev4", 00:15:48.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.751 "is_configured": false, 00:15:48.751 "data_offset": 0, 00:15:48.751 "data_size": 0 00:15:48.751 } 00:15:48.751 ] 00:15:48.751 }' 00:15:48.751 04:16:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:48.751 04:16:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:49.316 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:49.573 [2024-05-15 04:16:37.488568] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:49.573 BaseBdev2 00:15:49.573 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:15:49.573 04:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:15:49.573 04:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:49.573 04:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:49.573 04:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:49.573 04:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:49.573 04:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:49.831 04:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:50.088 [ 00:15:50.088 { 00:15:50.088 "name": "BaseBdev2", 00:15:50.088 "aliases": [ 00:15:50.088 "85a70256-e177-4148-8b7a-7402e659c7ae" 00:15:50.088 ], 00:15:50.088 "product_name": "Malloc disk", 00:15:50.088 "block_size": 512, 00:15:50.088 "num_blocks": 65536, 00:15:50.088 "uuid": "85a70256-e177-4148-8b7a-7402e659c7ae", 00:15:50.088 "assigned_rate_limits": { 00:15:50.089 "rw_ios_per_sec": 0, 00:15:50.089 "rw_mbytes_per_sec": 0, 00:15:50.089 "r_mbytes_per_sec": 0, 00:15:50.089 "w_mbytes_per_sec": 0 00:15:50.089 }, 00:15:50.089 "claimed": true, 00:15:50.089 "claim_type": "exclusive_write", 00:15:50.089 "zoned": false, 00:15:50.089 "supported_io_types": { 00:15:50.089 "read": true, 00:15:50.089 "write": true, 00:15:50.089 "unmap": true, 00:15:50.089 "write_zeroes": true, 00:15:50.089 "flush": true, 00:15:50.089 "reset": true, 00:15:50.089 "compare": false, 00:15:50.089 "compare_and_write": false, 00:15:50.089 "abort": true, 00:15:50.089 "nvme_admin": 
false, 00:15:50.089 "nvme_io": false 00:15:50.089 }, 00:15:50.089 "memory_domains": [ 00:15:50.089 { 00:15:50.089 "dma_device_id": "system", 00:15:50.089 "dma_device_type": 1 00:15:50.089 }, 00:15:50.089 { 00:15:50.089 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.089 "dma_device_type": 2 00:15:50.089 } 00:15:50.089 ], 00:15:50.089 "driver_specific": {} 00:15:50.089 } 00:15:50.089 ] 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.089 04:16:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:50.346 04:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:50.346 "name": "Existed_Raid", 00:15:50.346 "uuid": "5545a964-413d-4a8c-8cdf-93276d840821", 00:15:50.346 "strip_size_kb": 64, 00:15:50.346 "state": "configuring", 00:15:50.346 "raid_level": "concat", 00:15:50.346 "superblock": true, 00:15:50.346 "num_base_bdevs": 4, 00:15:50.346 "num_base_bdevs_discovered": 2, 00:15:50.346 "num_base_bdevs_operational": 4, 00:15:50.346 "base_bdevs_list": [ 00:15:50.346 { 00:15:50.346 "name": "BaseBdev1", 00:15:50.346 "uuid": "3c2e10dd-1e9f-45dc-87b6-cba0ae0e1cb0", 00:15:50.346 "is_configured": true, 00:15:50.346 "data_offset": 2048, 00:15:50.346 "data_size": 63488 00:15:50.346 }, 00:15:50.346 { 00:15:50.346 "name": "BaseBdev2", 00:15:50.346 "uuid": "85a70256-e177-4148-8b7a-7402e659c7ae", 00:15:50.346 "is_configured": true, 00:15:50.346 "data_offset": 2048, 00:15:50.346 "data_size": 63488 00:15:50.346 }, 00:15:50.346 { 00:15:50.346 "name": "BaseBdev3", 00:15:50.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.346 "is_configured": false, 00:15:50.346 "data_offset": 0, 00:15:50.346 "data_size": 0 00:15:50.346 }, 00:15:50.346 { 00:15:50.346 "name": "BaseBdev4", 00:15:50.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:50.346 
"is_configured": false, 00:15:50.346 "data_offset": 0, 00:15:50.346 "data_size": 0 00:15:50.346 } 00:15:50.346 ] 00:15:50.346 }' 00:15:50.346 04:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:50.346 04:16:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:50.912 04:16:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:51.170 [2024-05-15 04:16:39.009557] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:51.170 BaseBdev3 00:15:51.170 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:15:51.170 04:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:15:51.170 04:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:51.170 04:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:51.170 04:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:51.171 04:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:51.171 04:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:51.429 04:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:51.687 [ 00:15:51.687 { 00:15:51.687 "name": "BaseBdev3", 00:15:51.687 "aliases": [ 00:15:51.687 "5271dc78-d696-48fc-a12f-ceb0b40bcd33" 00:15:51.687 ], 00:15:51.687 "product_name": "Malloc disk", 00:15:51.687 "block_size": 512, 00:15:51.687 "num_blocks": 65536, 00:15:51.687 "uuid": "5271dc78-d696-48fc-a12f-ceb0b40bcd33", 00:15:51.687 "assigned_rate_limits": { 00:15:51.687 "rw_ios_per_sec": 0, 00:15:51.687 "rw_mbytes_per_sec": 0, 00:15:51.687 "r_mbytes_per_sec": 0, 00:15:51.687 "w_mbytes_per_sec": 0 00:15:51.687 }, 00:15:51.687 "claimed": true, 00:15:51.687 "claim_type": "exclusive_write", 00:15:51.687 "zoned": false, 00:15:51.687 "supported_io_types": { 00:15:51.687 "read": true, 00:15:51.687 "write": true, 00:15:51.687 "unmap": true, 00:15:51.687 "write_zeroes": true, 00:15:51.687 "flush": true, 00:15:51.687 "reset": true, 00:15:51.687 "compare": false, 00:15:51.687 "compare_and_write": false, 00:15:51.687 "abort": true, 00:15:51.687 "nvme_admin": false, 00:15:51.687 "nvme_io": false 00:15:51.687 }, 00:15:51.687 "memory_domains": [ 00:15:51.687 { 00:15:51.687 "dma_device_id": "system", 00:15:51.687 "dma_device_type": 1 00:15:51.687 }, 00:15:51.687 { 00:15:51.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.687 "dma_device_type": 2 00:15:51.687 } 00:15:51.687 ], 00:15:51.687 "driver_specific": {} 00:15:51.687 } 00:15:51.687 ] 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:51.687 04:16:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:51.687 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:51.945 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:51.945 "name": "Existed_Raid", 00:15:51.945 "uuid": "5545a964-413d-4a8c-8cdf-93276d840821", 00:15:51.945 "strip_size_kb": 64, 00:15:51.945 "state": "configuring", 00:15:51.945 "raid_level": "concat", 00:15:51.945 "superblock": true, 00:15:51.945 "num_base_bdevs": 4, 00:15:51.945 "num_base_bdevs_discovered": 3, 00:15:51.945 "num_base_bdevs_operational": 4, 00:15:51.945 "base_bdevs_list": [ 00:15:51.945 { 00:15:51.945 "name": "BaseBdev1", 00:15:51.945 "uuid": "3c2e10dd-1e9f-45dc-87b6-cba0ae0e1cb0", 00:15:51.945 "is_configured": true, 00:15:51.945 "data_offset": 2048, 00:15:51.945 "data_size": 63488 00:15:51.945 }, 00:15:51.945 { 00:15:51.945 "name": "BaseBdev2", 00:15:51.945 "uuid": "85a70256-e177-4148-8b7a-7402e659c7ae", 00:15:51.945 "is_configured": true, 00:15:51.945 "data_offset": 2048, 00:15:51.945 "data_size": 63488 00:15:51.945 }, 00:15:51.945 { 00:15:51.945 "name": "BaseBdev3", 00:15:51.945 "uuid": "5271dc78-d696-48fc-a12f-ceb0b40bcd33", 00:15:51.945 "is_configured": true, 00:15:51.945 "data_offset": 2048, 00:15:51.945 "data_size": 63488 00:15:51.945 }, 00:15:51.945 { 00:15:51.945 "name": "BaseBdev4", 00:15:51.945 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:51.945 "is_configured": false, 00:15:51.945 "data_offset": 0, 00:15:51.945 "data_size": 0 00:15:51.945 } 00:15:51.945 ] 00:15:51.945 }' 00:15:51.945 04:16:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:51.945 04:16:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:52.511 04:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:15:52.769 [2024-05-15 04:16:40.575151] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:15:52.769 [2024-05-15 04:16:40.575393] 
bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c037f0 00:15:52.769 [2024-05-15 04:16:40.575412] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:15:52.769 [2024-05-15 04:16:40.575585] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1db75f0 00:15:52.769 [2024-05-15 04:16:40.575735] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c037f0 00:15:52.769 [2024-05-15 04:16:40.575752] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c037f0 00:15:52.769 [2024-05-15 04:16:40.575870] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:52.770 BaseBdev4 00:15:52.770 04:16:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:15:52.770 04:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:15:52.770 04:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:52.770 04:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:52.770 04:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:52.770 04:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:52.770 04:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:53.027 04:16:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:15:53.285 [ 00:15:53.285 { 00:15:53.285 "name": "BaseBdev4", 00:15:53.285 "aliases": [ 00:15:53.285 "263c1d3e-7d44-42f9-b543-ac9ec5cad8ab" 00:15:53.285 ], 00:15:53.285 "product_name": "Malloc disk", 00:15:53.285 "block_size": 512, 00:15:53.285 "num_blocks": 65536, 00:15:53.285 "uuid": "263c1d3e-7d44-42f9-b543-ac9ec5cad8ab", 00:15:53.285 "assigned_rate_limits": { 00:15:53.285 "rw_ios_per_sec": 0, 00:15:53.285 "rw_mbytes_per_sec": 0, 00:15:53.285 "r_mbytes_per_sec": 0, 00:15:53.285 "w_mbytes_per_sec": 0 00:15:53.285 }, 00:15:53.285 "claimed": true, 00:15:53.285 "claim_type": "exclusive_write", 00:15:53.285 "zoned": false, 00:15:53.285 "supported_io_types": { 00:15:53.285 "read": true, 00:15:53.285 "write": true, 00:15:53.285 "unmap": true, 00:15:53.285 "write_zeroes": true, 00:15:53.285 "flush": true, 00:15:53.285 "reset": true, 00:15:53.285 "compare": false, 00:15:53.285 "compare_and_write": false, 00:15:53.285 "abort": true, 00:15:53.285 "nvme_admin": false, 00:15:53.285 "nvme_io": false 00:15:53.285 }, 00:15:53.285 "memory_domains": [ 00:15:53.285 { 00:15:53.285 "dma_device_id": "system", 00:15:53.285 "dma_device_type": 1 00:15:53.285 }, 00:15:53.285 { 00:15:53.285 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:53.285 "dma_device_type": 2 00:15:53.285 } 00:15:53.285 ], 00:15:53.285 "driver_specific": {} 00:15:53.285 } 00:15:53.285 ] 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:15:53.285 
04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.285 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:53.543 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:53.543 "name": "Existed_Raid", 00:15:53.543 "uuid": "5545a964-413d-4a8c-8cdf-93276d840821", 00:15:53.543 "strip_size_kb": 64, 00:15:53.543 "state": "online", 00:15:53.543 "raid_level": "concat", 00:15:53.543 "superblock": true, 00:15:53.543 "num_base_bdevs": 4, 00:15:53.543 "num_base_bdevs_discovered": 4, 00:15:53.543 "num_base_bdevs_operational": 4, 00:15:53.543 "base_bdevs_list": [ 00:15:53.543 { 00:15:53.543 "name": "BaseBdev1", 00:15:53.543 "uuid": "3c2e10dd-1e9f-45dc-87b6-cba0ae0e1cb0", 00:15:53.544 "is_configured": true, 00:15:53.544 "data_offset": 2048, 00:15:53.544 "data_size": 63488 00:15:53.544 }, 00:15:53.544 { 00:15:53.544 "name": "BaseBdev2", 00:15:53.544 "uuid": "85a70256-e177-4148-8b7a-7402e659c7ae", 00:15:53.544 "is_configured": true, 00:15:53.544 "data_offset": 2048, 00:15:53.544 "data_size": 63488 00:15:53.544 }, 00:15:53.544 { 00:15:53.544 "name": "BaseBdev3", 00:15:53.544 "uuid": "5271dc78-d696-48fc-a12f-ceb0b40bcd33", 00:15:53.544 "is_configured": true, 00:15:53.544 "data_offset": 2048, 00:15:53.544 "data_size": 63488 00:15:53.544 }, 00:15:53.544 { 00:15:53.544 "name": "BaseBdev4", 00:15:53.544 "uuid": "263c1d3e-7d44-42f9-b543-ac9ec5cad8ab", 00:15:53.544 "is_configured": true, 00:15:53.544 "data_offset": 2048, 00:15:53.544 "data_size": 63488 00:15:53.544 } 00:15:53.544 ] 00:15:53.544 }' 00:15:53.544 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:53.544 04:16:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:54.109 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:15:54.109 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:15:54.109 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:15:54.109 04:16:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:15:54.109 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:15:54.109 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:15:54.109 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:54.110 04:16:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:15:54.368 [2024-05-15 04:16:42.127554] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:54.368 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:15:54.368 "name": "Existed_Raid", 00:15:54.368 "aliases": [ 00:15:54.368 "5545a964-413d-4a8c-8cdf-93276d840821" 00:15:54.368 ], 00:15:54.368 "product_name": "Raid Volume", 00:15:54.368 "block_size": 512, 00:15:54.368 "num_blocks": 253952, 00:15:54.368 "uuid": "5545a964-413d-4a8c-8cdf-93276d840821", 00:15:54.368 "assigned_rate_limits": { 00:15:54.368 "rw_ios_per_sec": 0, 00:15:54.368 "rw_mbytes_per_sec": 0, 00:15:54.368 "r_mbytes_per_sec": 0, 00:15:54.368 "w_mbytes_per_sec": 0 00:15:54.368 }, 00:15:54.368 "claimed": false, 00:15:54.368 "zoned": false, 00:15:54.368 "supported_io_types": { 00:15:54.368 "read": true, 00:15:54.368 "write": true, 00:15:54.368 "unmap": true, 00:15:54.368 "write_zeroes": true, 00:15:54.368 "flush": true, 00:15:54.368 "reset": true, 00:15:54.368 "compare": false, 00:15:54.368 "compare_and_write": false, 00:15:54.368 "abort": false, 00:15:54.368 "nvme_admin": false, 00:15:54.368 "nvme_io": false 00:15:54.368 }, 00:15:54.368 "memory_domains": [ 00:15:54.368 { 00:15:54.368 "dma_device_id": "system", 00:15:54.368 "dma_device_type": 1 00:15:54.368 }, 00:15:54.368 { 00:15:54.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.368 "dma_device_type": 2 00:15:54.368 }, 00:15:54.368 { 00:15:54.368 "dma_device_id": "system", 00:15:54.368 "dma_device_type": 1 00:15:54.368 }, 00:15:54.368 { 00:15:54.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.368 "dma_device_type": 2 00:15:54.368 }, 00:15:54.368 { 00:15:54.368 "dma_device_id": "system", 00:15:54.368 "dma_device_type": 1 00:15:54.368 }, 00:15:54.368 { 00:15:54.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.368 "dma_device_type": 2 00:15:54.368 }, 00:15:54.368 { 00:15:54.368 "dma_device_id": "system", 00:15:54.368 "dma_device_type": 1 00:15:54.368 }, 00:15:54.368 { 00:15:54.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.368 "dma_device_type": 2 00:15:54.368 } 00:15:54.368 ], 00:15:54.368 "driver_specific": { 00:15:54.368 "raid": { 00:15:54.368 "uuid": "5545a964-413d-4a8c-8cdf-93276d840821", 00:15:54.368 "strip_size_kb": 64, 00:15:54.368 "state": "online", 00:15:54.368 "raid_level": "concat", 00:15:54.368 "superblock": true, 00:15:54.368 "num_base_bdevs": 4, 00:15:54.368 "num_base_bdevs_discovered": 4, 00:15:54.368 "num_base_bdevs_operational": 4, 00:15:54.368 "base_bdevs_list": [ 00:15:54.368 { 00:15:54.368 "name": "BaseBdev1", 00:15:54.368 "uuid": "3c2e10dd-1e9f-45dc-87b6-cba0ae0e1cb0", 00:15:54.368 "is_configured": true, 00:15:54.368 "data_offset": 2048, 00:15:54.368 "data_size": 63488 00:15:54.368 }, 00:15:54.368 { 00:15:54.368 "name": "BaseBdev2", 00:15:54.368 "uuid": "85a70256-e177-4148-8b7a-7402e659c7ae", 00:15:54.368 "is_configured": true, 00:15:54.368 
"data_offset": 2048, 00:15:54.368 "data_size": 63488 00:15:54.368 }, 00:15:54.368 { 00:15:54.368 "name": "BaseBdev3", 00:15:54.368 "uuid": "5271dc78-d696-48fc-a12f-ceb0b40bcd33", 00:15:54.368 "is_configured": true, 00:15:54.368 "data_offset": 2048, 00:15:54.368 "data_size": 63488 00:15:54.368 }, 00:15:54.368 { 00:15:54.368 "name": "BaseBdev4", 00:15:54.368 "uuid": "263c1d3e-7d44-42f9-b543-ac9ec5cad8ab", 00:15:54.368 "is_configured": true, 00:15:54.368 "data_offset": 2048, 00:15:54.368 "data_size": 63488 00:15:54.368 } 00:15:54.368 ] 00:15:54.368 } 00:15:54.368 } 00:15:54.368 }' 00:15:54.368 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:54.368 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:15:54.368 BaseBdev2 00:15:54.368 BaseBdev3 00:15:54.368 BaseBdev4' 00:15:54.368 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:54.368 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:54.368 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:54.626 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:54.626 "name": "BaseBdev1", 00:15:54.626 "aliases": [ 00:15:54.626 "3c2e10dd-1e9f-45dc-87b6-cba0ae0e1cb0" 00:15:54.626 ], 00:15:54.626 "product_name": "Malloc disk", 00:15:54.626 "block_size": 512, 00:15:54.626 "num_blocks": 65536, 00:15:54.626 "uuid": "3c2e10dd-1e9f-45dc-87b6-cba0ae0e1cb0", 00:15:54.626 "assigned_rate_limits": { 00:15:54.627 "rw_ios_per_sec": 0, 00:15:54.627 "rw_mbytes_per_sec": 0, 00:15:54.627 "r_mbytes_per_sec": 0, 00:15:54.627 "w_mbytes_per_sec": 0 00:15:54.627 }, 00:15:54.627 "claimed": true, 00:15:54.627 "claim_type": "exclusive_write", 00:15:54.627 "zoned": false, 00:15:54.627 "supported_io_types": { 00:15:54.627 "read": true, 00:15:54.627 "write": true, 00:15:54.627 "unmap": true, 00:15:54.627 "write_zeroes": true, 00:15:54.627 "flush": true, 00:15:54.627 "reset": true, 00:15:54.627 "compare": false, 00:15:54.627 "compare_and_write": false, 00:15:54.627 "abort": true, 00:15:54.627 "nvme_admin": false, 00:15:54.627 "nvme_io": false 00:15:54.627 }, 00:15:54.627 "memory_domains": [ 00:15:54.627 { 00:15:54.627 "dma_device_id": "system", 00:15:54.627 "dma_device_type": 1 00:15:54.627 }, 00:15:54.627 { 00:15:54.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.627 "dma_device_type": 2 00:15:54.627 } 00:15:54.627 ], 00:15:54.627 "driver_specific": {} 00:15:54.627 }' 00:15:54.627 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:54.627 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:54.627 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:54.627 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:54.627 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:54.627 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:54.627 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:54.627 
04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:54.885 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:54.885 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:54.885 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:54.885 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:54.885 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:54.885 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:54.885 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:55.143 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:55.143 "name": "BaseBdev2", 00:15:55.143 "aliases": [ 00:15:55.143 "85a70256-e177-4148-8b7a-7402e659c7ae" 00:15:55.143 ], 00:15:55.143 "product_name": "Malloc disk", 00:15:55.143 "block_size": 512, 00:15:55.143 "num_blocks": 65536, 00:15:55.143 "uuid": "85a70256-e177-4148-8b7a-7402e659c7ae", 00:15:55.143 "assigned_rate_limits": { 00:15:55.143 "rw_ios_per_sec": 0, 00:15:55.143 "rw_mbytes_per_sec": 0, 00:15:55.143 "r_mbytes_per_sec": 0, 00:15:55.143 "w_mbytes_per_sec": 0 00:15:55.143 }, 00:15:55.143 "claimed": true, 00:15:55.143 "claim_type": "exclusive_write", 00:15:55.143 "zoned": false, 00:15:55.143 "supported_io_types": { 00:15:55.143 "read": true, 00:15:55.143 "write": true, 00:15:55.143 "unmap": true, 00:15:55.143 "write_zeroes": true, 00:15:55.143 "flush": true, 00:15:55.143 "reset": true, 00:15:55.143 "compare": false, 00:15:55.143 "compare_and_write": false, 00:15:55.143 "abort": true, 00:15:55.143 "nvme_admin": false, 00:15:55.143 "nvme_io": false 00:15:55.143 }, 00:15:55.143 "memory_domains": [ 00:15:55.143 { 00:15:55.143 "dma_device_id": "system", 00:15:55.143 "dma_device_type": 1 00:15:55.143 }, 00:15:55.143 { 00:15:55.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.143 "dma_device_type": 2 00:15:55.143 } 00:15:55.143 ], 00:15:55.143 "driver_specific": {} 00:15:55.143 }' 00:15:55.143 04:16:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:55.143 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:55.143 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:55.143 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:55.143 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:55.143 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:55.143 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:55.401 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:55.401 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:55.401 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:55.401 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:55.401 04:16:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:55.401 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:55.401 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:55.401 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:55.662 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:55.662 "name": "BaseBdev3", 00:15:55.662 "aliases": [ 00:15:55.662 "5271dc78-d696-48fc-a12f-ceb0b40bcd33" 00:15:55.662 ], 00:15:55.662 "product_name": "Malloc disk", 00:15:55.662 "block_size": 512, 00:15:55.662 "num_blocks": 65536, 00:15:55.662 "uuid": "5271dc78-d696-48fc-a12f-ceb0b40bcd33", 00:15:55.662 "assigned_rate_limits": { 00:15:55.662 "rw_ios_per_sec": 0, 00:15:55.662 "rw_mbytes_per_sec": 0, 00:15:55.662 "r_mbytes_per_sec": 0, 00:15:55.662 "w_mbytes_per_sec": 0 00:15:55.662 }, 00:15:55.662 "claimed": true, 00:15:55.662 "claim_type": "exclusive_write", 00:15:55.662 "zoned": false, 00:15:55.662 "supported_io_types": { 00:15:55.662 "read": true, 00:15:55.662 "write": true, 00:15:55.662 "unmap": true, 00:15:55.662 "write_zeroes": true, 00:15:55.662 "flush": true, 00:15:55.662 "reset": true, 00:15:55.662 "compare": false, 00:15:55.662 "compare_and_write": false, 00:15:55.662 "abort": true, 00:15:55.662 "nvme_admin": false, 00:15:55.662 "nvme_io": false 00:15:55.662 }, 00:15:55.662 "memory_domains": [ 00:15:55.662 { 00:15:55.662 "dma_device_id": "system", 00:15:55.662 "dma_device_type": 1 00:15:55.662 }, 00:15:55.662 { 00:15:55.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:55.662 "dma_device_type": 2 00:15:55.662 } 00:15:55.662 ], 00:15:55.662 "driver_specific": {} 00:15:55.662 }' 00:15:55.662 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:55.662 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:55.662 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:55.662 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:55.662 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:55.662 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:55.662 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:55.945 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:55.945 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:55.945 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:55.945 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:55.945 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:55.945 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:15:55.945 04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:15:55.945 
04:16:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:15:56.231 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:15:56.231 "name": "BaseBdev4", 00:15:56.231 "aliases": [ 00:15:56.231 "263c1d3e-7d44-42f9-b543-ac9ec5cad8ab" 00:15:56.231 ], 00:15:56.231 "product_name": "Malloc disk", 00:15:56.231 "block_size": 512, 00:15:56.231 "num_blocks": 65536, 00:15:56.231 "uuid": "263c1d3e-7d44-42f9-b543-ac9ec5cad8ab", 00:15:56.231 "assigned_rate_limits": { 00:15:56.231 "rw_ios_per_sec": 0, 00:15:56.231 "rw_mbytes_per_sec": 0, 00:15:56.231 "r_mbytes_per_sec": 0, 00:15:56.231 "w_mbytes_per_sec": 0 00:15:56.231 }, 00:15:56.231 "claimed": true, 00:15:56.231 "claim_type": "exclusive_write", 00:15:56.231 "zoned": false, 00:15:56.231 "supported_io_types": { 00:15:56.231 "read": true, 00:15:56.231 "write": true, 00:15:56.231 "unmap": true, 00:15:56.231 "write_zeroes": true, 00:15:56.231 "flush": true, 00:15:56.231 "reset": true, 00:15:56.231 "compare": false, 00:15:56.231 "compare_and_write": false, 00:15:56.231 "abort": true, 00:15:56.231 "nvme_admin": false, 00:15:56.231 "nvme_io": false 00:15:56.231 }, 00:15:56.231 "memory_domains": [ 00:15:56.231 { 00:15:56.231 "dma_device_id": "system", 00:15:56.231 "dma_device_type": 1 00:15:56.231 }, 00:15:56.231 { 00:15:56.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:56.231 "dma_device_type": 2 00:15:56.231 } 00:15:56.231 ], 00:15:56.231 "driver_specific": {} 00:15:56.231 }' 00:15:56.231 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:56.231 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:15:56.231 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:15:56.231 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:56.231 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:15:56.231 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:56.231 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:56.231 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:15:56.489 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:56.489 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:56.489 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:15:56.489 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:15:56.489 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:56.747 [2024-05-15 04:16:44.565777] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:56.747 [2024-05-15 04:16:44.565807] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:56.747 [2024-05-15 04:16:44.565889] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # 
has_redundancy concat 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@216 -- # return 1 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@278 -- # expected_state=offline 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=offline 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.747 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:57.005 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:15:57.005 "name": "Existed_Raid", 00:15:57.005 "uuid": "5545a964-413d-4a8c-8cdf-93276d840821", 00:15:57.005 "strip_size_kb": 64, 00:15:57.005 "state": "offline", 00:15:57.005 "raid_level": "concat", 00:15:57.005 "superblock": true, 00:15:57.005 "num_base_bdevs": 4, 00:15:57.005 "num_base_bdevs_discovered": 3, 00:15:57.005 "num_base_bdevs_operational": 3, 00:15:57.005 "base_bdevs_list": [ 00:15:57.005 { 00:15:57.005 "name": null, 00:15:57.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:57.005 "is_configured": false, 00:15:57.005 "data_offset": 2048, 00:15:57.005 "data_size": 63488 00:15:57.005 }, 00:15:57.005 { 00:15:57.005 "name": "BaseBdev2", 00:15:57.005 "uuid": "85a70256-e177-4148-8b7a-7402e659c7ae", 00:15:57.005 "is_configured": true, 00:15:57.005 "data_offset": 2048, 00:15:57.005 "data_size": 63488 00:15:57.005 }, 00:15:57.005 { 00:15:57.005 "name": "BaseBdev3", 00:15:57.005 "uuid": "5271dc78-d696-48fc-a12f-ceb0b40bcd33", 00:15:57.005 "is_configured": true, 00:15:57.005 "data_offset": 2048, 00:15:57.005 "data_size": 63488 00:15:57.005 }, 00:15:57.005 { 00:15:57.005 "name": "BaseBdev4", 00:15:57.005 "uuid": "263c1d3e-7d44-42f9-b543-ac9ec5cad8ab", 00:15:57.005 "is_configured": true, 00:15:57.005 "data_offset": 2048, 00:15:57.005 "data_size": 63488 00:15:57.005 } 00:15:57.005 ] 00:15:57.005 }' 00:15:57.005 04:16:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:15:57.005 04:16:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:57.571 04:16:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:15:57.572 04:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:57.572 04:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.572 04:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:57.831 04:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:57.831 04:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:57.831 04:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:58.089 [2024-05-15 04:16:45.927484] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:58.089 04:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:58.089 04:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:58.089 04:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.089 04:16:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:58.347 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:58.347 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:58.347 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:58.605 [2024-05-15 04:16:46.437940] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:58.605 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:58.605 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:58.605 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.605 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:15:58.863 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:15:58.863 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:58.863 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:15:59.121 [2024-05-15 04:16:46.944332] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:15:59.121 [2024-05-15 04:16:46.944384] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c037f0 name Existed_Raid, state offline 00:15:59.121 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:15:59.121 04:16:46 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:15:59.121 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.121 04:16:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:15:59.379 04:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:15:59.379 04:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:15:59.379 04:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:15:59.379 04:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:15:59.379 04:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:15:59.379 04:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:59.636 BaseBdev2 00:15:59.636 04:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:15:59.636 04:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:15:59.636 04:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:15:59.636 04:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:15:59.636 04:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:15:59.636 04:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:15:59.636 04:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:59.894 04:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:00.152 [ 00:16:00.152 { 00:16:00.152 "name": "BaseBdev2", 00:16:00.152 "aliases": [ 00:16:00.152 "7cf016cc-6c5f-4dd9-8aad-92eebce772de" 00:16:00.152 ], 00:16:00.152 "product_name": "Malloc disk", 00:16:00.152 "block_size": 512, 00:16:00.152 "num_blocks": 65536, 00:16:00.152 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:00.152 "assigned_rate_limits": { 00:16:00.152 "rw_ios_per_sec": 0, 00:16:00.152 "rw_mbytes_per_sec": 0, 00:16:00.152 "r_mbytes_per_sec": 0, 00:16:00.152 "w_mbytes_per_sec": 0 00:16:00.152 }, 00:16:00.152 "claimed": false, 00:16:00.152 "zoned": false, 00:16:00.152 "supported_io_types": { 00:16:00.152 "read": true, 00:16:00.152 "write": true, 00:16:00.152 "unmap": true, 00:16:00.152 "write_zeroes": true, 00:16:00.152 "flush": true, 00:16:00.152 "reset": true, 00:16:00.152 "compare": false, 00:16:00.152 "compare_and_write": false, 00:16:00.152 "abort": true, 00:16:00.152 "nvme_admin": false, 00:16:00.152 "nvme_io": false 00:16:00.152 }, 00:16:00.152 "memory_domains": [ 00:16:00.152 { 00:16:00.152 "dma_device_id": "system", 00:16:00.152 "dma_device_type": 1 00:16:00.152 }, 00:16:00.152 { 00:16:00.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.152 "dma_device_type": 2 00:16:00.152 } 00:16:00.152 ], 00:16:00.152 "driver_specific": 
{} 00:16:00.152 } 00:16:00.152 ] 00:16:00.152 04:16:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:00.152 04:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:00.152 04:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:00.152 04:16:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:00.410 BaseBdev3 00:16:00.410 04:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:16:00.410 04:16:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:16:00.410 04:16:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:00.410 04:16:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:00.410 04:16:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:00.410 04:16:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:00.410 04:16:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:00.668 04:16:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:00.940 [ 00:16:00.940 { 00:16:00.940 "name": "BaseBdev3", 00:16:00.940 "aliases": [ 00:16:00.940 "aeea2783-a9bd-4d46-ac13-2381a9f484f4" 00:16:00.940 ], 00:16:00.940 "product_name": "Malloc disk", 00:16:00.940 "block_size": 512, 00:16:00.940 "num_blocks": 65536, 00:16:00.940 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:00.940 "assigned_rate_limits": { 00:16:00.940 "rw_ios_per_sec": 0, 00:16:00.940 "rw_mbytes_per_sec": 0, 00:16:00.940 "r_mbytes_per_sec": 0, 00:16:00.940 "w_mbytes_per_sec": 0 00:16:00.940 }, 00:16:00.940 "claimed": false, 00:16:00.940 "zoned": false, 00:16:00.940 "supported_io_types": { 00:16:00.940 "read": true, 00:16:00.940 "write": true, 00:16:00.940 "unmap": true, 00:16:00.940 "write_zeroes": true, 00:16:00.940 "flush": true, 00:16:00.940 "reset": true, 00:16:00.940 "compare": false, 00:16:00.940 "compare_and_write": false, 00:16:00.940 "abort": true, 00:16:00.940 "nvme_admin": false, 00:16:00.940 "nvme_io": false 00:16:00.940 }, 00:16:00.940 "memory_domains": [ 00:16:00.940 { 00:16:00.940 "dma_device_id": "system", 00:16:00.940 "dma_device_type": 1 00:16:00.940 }, 00:16:00.940 { 00:16:00.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.940 "dma_device_type": 2 00:16:00.940 } 00:16:00.940 ], 00:16:00.940 "driver_specific": {} 00:16:00.940 } 00:16:00.940 ] 00:16:00.940 04:16:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:00.940 04:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:00.940 04:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:00.940 04:16:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 
32 512 -b BaseBdev4 00:16:01.198 BaseBdev4 00:16:01.198 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:16:01.198 04:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:16:01.198 04:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:01.198 04:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:01.198 04:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:01.198 04:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:01.198 04:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:01.457 04:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:01.716 [ 00:16:01.716 { 00:16:01.716 "name": "BaseBdev4", 00:16:01.716 "aliases": [ 00:16:01.716 "9aac17b0-178f-460d-8dc3-c876e19960bf" 00:16:01.716 ], 00:16:01.716 "product_name": "Malloc disk", 00:16:01.716 "block_size": 512, 00:16:01.716 "num_blocks": 65536, 00:16:01.716 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:01.716 "assigned_rate_limits": { 00:16:01.716 "rw_ios_per_sec": 0, 00:16:01.716 "rw_mbytes_per_sec": 0, 00:16:01.716 "r_mbytes_per_sec": 0, 00:16:01.716 "w_mbytes_per_sec": 0 00:16:01.716 }, 00:16:01.716 "claimed": false, 00:16:01.716 "zoned": false, 00:16:01.716 "supported_io_types": { 00:16:01.716 "read": true, 00:16:01.716 "write": true, 00:16:01.716 "unmap": true, 00:16:01.716 "write_zeroes": true, 00:16:01.716 "flush": true, 00:16:01.716 "reset": true, 00:16:01.716 "compare": false, 00:16:01.716 "compare_and_write": false, 00:16:01.716 "abort": true, 00:16:01.716 "nvme_admin": false, 00:16:01.716 "nvme_io": false 00:16:01.716 }, 00:16:01.716 "memory_domains": [ 00:16:01.716 { 00:16:01.716 "dma_device_id": "system", 00:16:01.716 "dma_device_type": 1 00:16:01.716 }, 00:16:01.716 { 00:16:01.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:01.716 "dma_device_type": 2 00:16:01.716 } 00:16:01.716 ], 00:16:01.716 "driver_specific": {} 00:16:01.716 } 00:16:01.716 ] 00:16:01.716 04:16:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:01.716 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:01.716 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:01.716 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:01.975 [2024-05-15 04:16:49.905419] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:01.975 [2024-05-15 04:16:49.905464] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:01.975 [2024-05-15 04:16:49.905502] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:01.975 [2024-05-15 04:16:49.906941] bdev_raid.c:3138:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev3 is claimed 00:16:01.975 [2024-05-15 04:16:49.906992] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.975 04:16:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:02.233 04:16:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:02.233 "name": "Existed_Raid", 00:16:02.233 "uuid": "80db366d-999c-48bb-bc97-53d459d5d8ab", 00:16:02.233 "strip_size_kb": 64, 00:16:02.233 "state": "configuring", 00:16:02.233 "raid_level": "concat", 00:16:02.233 "superblock": true, 00:16:02.233 "num_base_bdevs": 4, 00:16:02.233 "num_base_bdevs_discovered": 3, 00:16:02.233 "num_base_bdevs_operational": 4, 00:16:02.233 "base_bdevs_list": [ 00:16:02.233 { 00:16:02.233 "name": "BaseBdev1", 00:16:02.233 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.233 "is_configured": false, 00:16:02.233 "data_offset": 0, 00:16:02.233 "data_size": 0 00:16:02.233 }, 00:16:02.233 { 00:16:02.233 "name": "BaseBdev2", 00:16:02.233 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:02.233 "is_configured": true, 00:16:02.233 "data_offset": 2048, 00:16:02.233 "data_size": 63488 00:16:02.233 }, 00:16:02.233 { 00:16:02.233 "name": "BaseBdev3", 00:16:02.233 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:02.233 "is_configured": true, 00:16:02.233 "data_offset": 2048, 00:16:02.233 "data_size": 63488 00:16:02.233 }, 00:16:02.233 { 00:16:02.233 "name": "BaseBdev4", 00:16:02.233 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:02.233 "is_configured": true, 00:16:02.233 "data_offset": 2048, 00:16:02.233 "data_size": 63488 00:16:02.233 } 00:16:02.233 ] 00:16:02.233 }' 00:16:02.233 04:16:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:02.233 04:16:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:02.799 04:16:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev 
BaseBdev2 00:16:03.058 [2024-05-15 04:16:51.004279] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.058 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.316 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:03.316 "name": "Existed_Raid", 00:16:03.316 "uuid": "80db366d-999c-48bb-bc97-53d459d5d8ab", 00:16:03.316 "strip_size_kb": 64, 00:16:03.316 "state": "configuring", 00:16:03.316 "raid_level": "concat", 00:16:03.316 "superblock": true, 00:16:03.316 "num_base_bdevs": 4, 00:16:03.316 "num_base_bdevs_discovered": 2, 00:16:03.316 "num_base_bdevs_operational": 4, 00:16:03.316 "base_bdevs_list": [ 00:16:03.316 { 00:16:03.316 "name": "BaseBdev1", 00:16:03.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.316 "is_configured": false, 00:16:03.316 "data_offset": 0, 00:16:03.316 "data_size": 0 00:16:03.316 }, 00:16:03.316 { 00:16:03.316 "name": null, 00:16:03.316 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:03.316 "is_configured": false, 00:16:03.316 "data_offset": 2048, 00:16:03.316 "data_size": 63488 00:16:03.316 }, 00:16:03.316 { 00:16:03.316 "name": "BaseBdev3", 00:16:03.316 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:03.316 "is_configured": true, 00:16:03.316 "data_offset": 2048, 00:16:03.316 "data_size": 63488 00:16:03.316 }, 00:16:03.316 { 00:16:03.316 "name": "BaseBdev4", 00:16:03.316 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:03.316 "is_configured": true, 00:16:03.316 "data_offset": 2048, 00:16:03.316 "data_size": 63488 00:16:03.316 } 00:16:03.316 ] 00:16:03.316 }' 00:16:03.316 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:03.316 04:16:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:03.880 04:16:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.880 04:16:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:04.137 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:16:04.137 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:04.395 [2024-05-15 04:16:52.313326] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:04.395 BaseBdev1 00:16:04.395 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:16:04.395 04:16:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:16:04.395 04:16:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:04.395 04:16:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:04.395 04:16:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:04.395 04:16:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:04.395 04:16:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:04.654 04:16:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:04.912 [ 00:16:04.912 { 00:16:04.912 "name": "BaseBdev1", 00:16:04.912 "aliases": [ 00:16:04.912 "8edf665d-feb9-4abb-afc6-ad91a3e03f13" 00:16:04.912 ], 00:16:04.912 "product_name": "Malloc disk", 00:16:04.912 "block_size": 512, 00:16:04.912 "num_blocks": 65536, 00:16:04.912 "uuid": "8edf665d-feb9-4abb-afc6-ad91a3e03f13", 00:16:04.912 "assigned_rate_limits": { 00:16:04.912 "rw_ios_per_sec": 0, 00:16:04.912 "rw_mbytes_per_sec": 0, 00:16:04.912 "r_mbytes_per_sec": 0, 00:16:04.912 "w_mbytes_per_sec": 0 00:16:04.912 }, 00:16:04.912 "claimed": true, 00:16:04.912 "claim_type": "exclusive_write", 00:16:04.912 "zoned": false, 00:16:04.912 "supported_io_types": { 00:16:04.912 "read": true, 00:16:04.912 "write": true, 00:16:04.912 "unmap": true, 00:16:04.912 "write_zeroes": true, 00:16:04.912 "flush": true, 00:16:04.912 "reset": true, 00:16:04.912 "compare": false, 00:16:04.912 "compare_and_write": false, 00:16:04.912 "abort": true, 00:16:04.912 "nvme_admin": false, 00:16:04.912 "nvme_io": false 00:16:04.912 }, 00:16:04.912 "memory_domains": [ 00:16:04.912 { 00:16:04.912 "dma_device_id": "system", 00:16:04.912 "dma_device_type": 1 00:16:04.912 }, 00:16:04.912 { 00:16:04.912 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.912 "dma_device_type": 2 00:16:04.912 } 00:16:04.912 ], 00:16:04.912 "driver_specific": {} 00:16:04.912 } 00:16:04.912 ] 00:16:04.912 04:16:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:04.912 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:04.912 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:04.912 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
expected_state=configuring 00:16:04.912 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:04.912 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:04.913 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:04.913 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:04.913 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:04.913 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:04.913 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:04.913 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.913 04:16:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.171 04:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:05.171 "name": "Existed_Raid", 00:16:05.171 "uuid": "80db366d-999c-48bb-bc97-53d459d5d8ab", 00:16:05.171 "strip_size_kb": 64, 00:16:05.171 "state": "configuring", 00:16:05.171 "raid_level": "concat", 00:16:05.171 "superblock": true, 00:16:05.171 "num_base_bdevs": 4, 00:16:05.171 "num_base_bdevs_discovered": 3, 00:16:05.171 "num_base_bdevs_operational": 4, 00:16:05.171 "base_bdevs_list": [ 00:16:05.171 { 00:16:05.171 "name": "BaseBdev1", 00:16:05.171 "uuid": "8edf665d-feb9-4abb-afc6-ad91a3e03f13", 00:16:05.171 "is_configured": true, 00:16:05.171 "data_offset": 2048, 00:16:05.171 "data_size": 63488 00:16:05.171 }, 00:16:05.171 { 00:16:05.171 "name": null, 00:16:05.171 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:05.171 "is_configured": false, 00:16:05.171 "data_offset": 2048, 00:16:05.171 "data_size": 63488 00:16:05.171 }, 00:16:05.171 { 00:16:05.171 "name": "BaseBdev3", 00:16:05.171 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:05.171 "is_configured": true, 00:16:05.171 "data_offset": 2048, 00:16:05.171 "data_size": 63488 00:16:05.171 }, 00:16:05.171 { 00:16:05.171 "name": "BaseBdev4", 00:16:05.171 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:05.171 "is_configured": true, 00:16:05.171 "data_offset": 2048, 00:16:05.171 "data_size": 63488 00:16:05.171 } 00:16:05.171 ] 00:16:05.171 }' 00:16:05.171 04:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:05.171 04:16:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:05.737 04:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.737 04:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:05.995 04:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:16:05.996 04:16:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:06.253 [2024-05-15 04:16:54.025841] 
bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.253 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:06.511 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:06.511 "name": "Existed_Raid", 00:16:06.511 "uuid": "80db366d-999c-48bb-bc97-53d459d5d8ab", 00:16:06.511 "strip_size_kb": 64, 00:16:06.511 "state": "configuring", 00:16:06.511 "raid_level": "concat", 00:16:06.511 "superblock": true, 00:16:06.511 "num_base_bdevs": 4, 00:16:06.511 "num_base_bdevs_discovered": 2, 00:16:06.511 "num_base_bdevs_operational": 4, 00:16:06.511 "base_bdevs_list": [ 00:16:06.511 { 00:16:06.511 "name": "BaseBdev1", 00:16:06.511 "uuid": "8edf665d-feb9-4abb-afc6-ad91a3e03f13", 00:16:06.511 "is_configured": true, 00:16:06.511 "data_offset": 2048, 00:16:06.511 "data_size": 63488 00:16:06.511 }, 00:16:06.511 { 00:16:06.511 "name": null, 00:16:06.511 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:06.511 "is_configured": false, 00:16:06.511 "data_offset": 2048, 00:16:06.511 "data_size": 63488 00:16:06.511 }, 00:16:06.511 { 00:16:06.511 "name": null, 00:16:06.511 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:06.511 "is_configured": false, 00:16:06.511 "data_offset": 2048, 00:16:06.511 "data_size": 63488 00:16:06.511 }, 00:16:06.511 { 00:16:06.511 "name": "BaseBdev4", 00:16:06.511 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:06.511 "is_configured": true, 00:16:06.511 "data_offset": 2048, 00:16:06.511 "data_size": 63488 00:16:06.511 } 00:16:06.511 ] 00:16:06.511 }' 00:16:06.511 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:06.511 04:16:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:07.074 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.074 04:16:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq 
'.[0].base_bdevs_list[2].is_configured' 00:16:07.074 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:16:07.074 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:07.332 [2024-05-15 04:16:55.265110] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:07.332 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.589 04:16:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:07.589 "name": "Existed_Raid", 00:16:07.589 "uuid": "80db366d-999c-48bb-bc97-53d459d5d8ab", 00:16:07.589 "strip_size_kb": 64, 00:16:07.589 "state": "configuring", 00:16:07.589 "raid_level": "concat", 00:16:07.589 "superblock": true, 00:16:07.589 "num_base_bdevs": 4, 00:16:07.589 "num_base_bdevs_discovered": 3, 00:16:07.589 "num_base_bdevs_operational": 4, 00:16:07.589 "base_bdevs_list": [ 00:16:07.589 { 00:16:07.589 "name": "BaseBdev1", 00:16:07.589 "uuid": "8edf665d-feb9-4abb-afc6-ad91a3e03f13", 00:16:07.589 "is_configured": true, 00:16:07.589 "data_offset": 2048, 00:16:07.589 "data_size": 63488 00:16:07.589 }, 00:16:07.589 { 00:16:07.589 "name": null, 00:16:07.589 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:07.589 "is_configured": false, 00:16:07.589 "data_offset": 2048, 00:16:07.589 "data_size": 63488 00:16:07.589 }, 00:16:07.589 { 00:16:07.589 "name": "BaseBdev3", 00:16:07.589 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:07.589 "is_configured": true, 00:16:07.589 "data_offset": 2048, 00:16:07.589 "data_size": 63488 00:16:07.589 }, 00:16:07.589 { 00:16:07.589 "name": "BaseBdev4", 00:16:07.589 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:07.589 "is_configured": true, 00:16:07.589 "data_offset": 2048, 00:16:07.589 "data_size": 63488 00:16:07.589 } 00:16:07.589 ] 00:16:07.589 }' 00:16:07.589 04:16:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:07.589 04:16:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:08.154 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.154 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:08.412 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:16:08.412 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:08.670 [2024-05-15 04:16:56.484392] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.670 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.928 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:08.928 "name": "Existed_Raid", 00:16:08.928 "uuid": "80db366d-999c-48bb-bc97-53d459d5d8ab", 00:16:08.928 "strip_size_kb": 64, 00:16:08.928 "state": "configuring", 00:16:08.928 "raid_level": "concat", 00:16:08.928 "superblock": true, 00:16:08.928 "num_base_bdevs": 4, 00:16:08.928 "num_base_bdevs_discovered": 2, 00:16:08.928 "num_base_bdevs_operational": 4, 00:16:08.928 "base_bdevs_list": [ 00:16:08.928 { 00:16:08.928 "name": null, 00:16:08.928 "uuid": "8edf665d-feb9-4abb-afc6-ad91a3e03f13", 00:16:08.928 "is_configured": false, 00:16:08.928 "data_offset": 2048, 00:16:08.928 "data_size": 63488 00:16:08.928 }, 00:16:08.928 { 00:16:08.928 "name": null, 00:16:08.928 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:08.928 "is_configured": false, 00:16:08.928 "data_offset": 2048, 00:16:08.928 "data_size": 63488 00:16:08.928 }, 00:16:08.928 { 00:16:08.928 "name": "BaseBdev3", 00:16:08.928 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:08.928 "is_configured": true, 00:16:08.928 
"data_offset": 2048, 00:16:08.928 "data_size": 63488 00:16:08.928 }, 00:16:08.928 { 00:16:08.928 "name": "BaseBdev4", 00:16:08.928 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:08.928 "is_configured": true, 00:16:08.928 "data_offset": 2048, 00:16:08.928 "data_size": 63488 00:16:08.928 } 00:16:08.928 ] 00:16:08.928 }' 00:16:08.928 04:16:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:08.928 04:16:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:09.493 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.493 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:09.750 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:16:09.750 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:09.750 [2024-05-15 04:16:57.755687] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.008 04:16:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:10.008 04:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:10.008 "name": "Existed_Raid", 00:16:10.008 "uuid": "80db366d-999c-48bb-bc97-53d459d5d8ab", 00:16:10.008 "strip_size_kb": 64, 00:16:10.008 "state": "configuring", 00:16:10.008 "raid_level": "concat", 00:16:10.008 "superblock": true, 00:16:10.008 "num_base_bdevs": 4, 00:16:10.008 "num_base_bdevs_discovered": 3, 00:16:10.008 "num_base_bdevs_operational": 4, 00:16:10.008 "base_bdevs_list": [ 00:16:10.008 { 00:16:10.008 "name": null, 00:16:10.008 "uuid": "8edf665d-feb9-4abb-afc6-ad91a3e03f13", 00:16:10.008 "is_configured": false, 00:16:10.008 "data_offset": 2048, 
00:16:10.008 "data_size": 63488 00:16:10.008 }, 00:16:10.008 { 00:16:10.008 "name": "BaseBdev2", 00:16:10.008 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:10.008 "is_configured": true, 00:16:10.008 "data_offset": 2048, 00:16:10.008 "data_size": 63488 00:16:10.008 }, 00:16:10.008 { 00:16:10.008 "name": "BaseBdev3", 00:16:10.008 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:10.008 "is_configured": true, 00:16:10.008 "data_offset": 2048, 00:16:10.008 "data_size": 63488 00:16:10.008 }, 00:16:10.008 { 00:16:10.008 "name": "BaseBdev4", 00:16:10.008 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:10.008 "is_configured": true, 00:16:10.008 "data_offset": 2048, 00:16:10.008 "data_size": 63488 00:16:10.008 } 00:16:10.008 ] 00:16:10.008 }' 00:16:10.008 04:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:10.008 04:16:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:10.573 04:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.573 04:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:10.830 04:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:16:10.830 04:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.830 04:16:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:11.088 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8edf665d-feb9-4abb-afc6-ad91a3e03f13 00:16:11.346 [2024-05-15 04:16:59.336625] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:11.346 [2024-05-15 04:16:59.336878] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c04cd0 00:16:11.346 [2024-05-15 04:16:59.336893] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:11.346 [2024-05-15 04:16:59.337043] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1db59e0 00:16:11.346 [2024-05-15 04:16:59.337171] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c04cd0 00:16:11.346 [2024-05-15 04:16:59.337184] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c04cd0 00:16:11.346 [2024-05-15 04:16:59.337276] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:11.346 NewBaseBdev 00:16:11.346 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:16:11.346 04:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:16:11.346 04:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:11.346 04:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:16:11.346 04:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:11.346 04:16:59 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:11.346 04:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:11.604 04:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:11.861 [ 00:16:11.861 { 00:16:11.861 "name": "NewBaseBdev", 00:16:11.861 "aliases": [ 00:16:11.861 "8edf665d-feb9-4abb-afc6-ad91a3e03f13" 00:16:11.861 ], 00:16:11.861 "product_name": "Malloc disk", 00:16:11.861 "block_size": 512, 00:16:11.861 "num_blocks": 65536, 00:16:11.861 "uuid": "8edf665d-feb9-4abb-afc6-ad91a3e03f13", 00:16:11.861 "assigned_rate_limits": { 00:16:11.862 "rw_ios_per_sec": 0, 00:16:11.862 "rw_mbytes_per_sec": 0, 00:16:11.862 "r_mbytes_per_sec": 0, 00:16:11.862 "w_mbytes_per_sec": 0 00:16:11.862 }, 00:16:11.862 "claimed": true, 00:16:11.862 "claim_type": "exclusive_write", 00:16:11.862 "zoned": false, 00:16:11.862 "supported_io_types": { 00:16:11.862 "read": true, 00:16:11.862 "write": true, 00:16:11.862 "unmap": true, 00:16:11.862 "write_zeroes": true, 00:16:11.862 "flush": true, 00:16:11.862 "reset": true, 00:16:11.862 "compare": false, 00:16:11.862 "compare_and_write": false, 00:16:11.862 "abort": true, 00:16:11.862 "nvme_admin": false, 00:16:11.862 "nvme_io": false 00:16:11.862 }, 00:16:11.862 "memory_domains": [ 00:16:11.862 { 00:16:11.862 "dma_device_id": "system", 00:16:11.862 "dma_device_type": 1 00:16:11.862 }, 00:16:11.862 { 00:16:11.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.862 "dma_device_type": 2 00:16:11.862 } 00:16:11.862 ], 00:16:11.862 "driver_specific": {} 00:16:11.862 } 00:16:11.862 ] 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.862 04:16:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.119 04:17:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:12.119 "name": "Existed_Raid", 00:16:12.119 "uuid": "80db366d-999c-48bb-bc97-53d459d5d8ab", 00:16:12.119 "strip_size_kb": 64, 00:16:12.119 "state": "online", 00:16:12.119 "raid_level": "concat", 00:16:12.119 "superblock": true, 00:16:12.120 "num_base_bdevs": 4, 00:16:12.120 "num_base_bdevs_discovered": 4, 00:16:12.120 "num_base_bdevs_operational": 4, 00:16:12.120 "base_bdevs_list": [ 00:16:12.120 { 00:16:12.120 "name": "NewBaseBdev", 00:16:12.120 "uuid": "8edf665d-feb9-4abb-afc6-ad91a3e03f13", 00:16:12.120 "is_configured": true, 00:16:12.120 "data_offset": 2048, 00:16:12.120 "data_size": 63488 00:16:12.120 }, 00:16:12.120 { 00:16:12.120 "name": "BaseBdev2", 00:16:12.120 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:12.120 "is_configured": true, 00:16:12.120 "data_offset": 2048, 00:16:12.120 "data_size": 63488 00:16:12.120 }, 00:16:12.120 { 00:16:12.120 "name": "BaseBdev3", 00:16:12.120 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:12.120 "is_configured": true, 00:16:12.120 "data_offset": 2048, 00:16:12.120 "data_size": 63488 00:16:12.120 }, 00:16:12.120 { 00:16:12.120 "name": "BaseBdev4", 00:16:12.120 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:12.120 "is_configured": true, 00:16:12.120 "data_offset": 2048, 00:16:12.120 "data_size": 63488 00:16:12.120 } 00:16:12.120 ] 00:16:12.120 }' 00:16:12.120 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:12.120 04:17:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:12.685 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:16:12.685 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:16:12.685 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:12.685 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:12.685 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:12.685 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:16:12.685 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:12.685 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:12.943 [2024-05-15 04:17:00.820836] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:12.943 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:12.943 "name": "Existed_Raid", 00:16:12.943 "aliases": [ 00:16:12.943 "80db366d-999c-48bb-bc97-53d459d5d8ab" 00:16:12.943 ], 00:16:12.943 "product_name": "Raid Volume", 00:16:12.943 "block_size": 512, 00:16:12.943 "num_blocks": 253952, 00:16:12.943 "uuid": "80db366d-999c-48bb-bc97-53d459d5d8ab", 00:16:12.943 "assigned_rate_limits": { 00:16:12.943 "rw_ios_per_sec": 0, 00:16:12.943 "rw_mbytes_per_sec": 0, 00:16:12.943 "r_mbytes_per_sec": 0, 00:16:12.943 "w_mbytes_per_sec": 0 00:16:12.943 }, 00:16:12.943 "claimed": false, 00:16:12.943 "zoned": false, 00:16:12.943 "supported_io_types": { 00:16:12.943 "read": true, 00:16:12.943 "write": true, 00:16:12.943 "unmap": true, 00:16:12.943 
"write_zeroes": true, 00:16:12.943 "flush": true, 00:16:12.943 "reset": true, 00:16:12.943 "compare": false, 00:16:12.943 "compare_and_write": false, 00:16:12.943 "abort": false, 00:16:12.943 "nvme_admin": false, 00:16:12.943 "nvme_io": false 00:16:12.943 }, 00:16:12.943 "memory_domains": [ 00:16:12.943 { 00:16:12.943 "dma_device_id": "system", 00:16:12.943 "dma_device_type": 1 00:16:12.943 }, 00:16:12.943 { 00:16:12.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.943 "dma_device_type": 2 00:16:12.943 }, 00:16:12.943 { 00:16:12.943 "dma_device_id": "system", 00:16:12.943 "dma_device_type": 1 00:16:12.943 }, 00:16:12.943 { 00:16:12.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.943 "dma_device_type": 2 00:16:12.943 }, 00:16:12.943 { 00:16:12.943 "dma_device_id": "system", 00:16:12.943 "dma_device_type": 1 00:16:12.943 }, 00:16:12.943 { 00:16:12.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.943 "dma_device_type": 2 00:16:12.943 }, 00:16:12.943 { 00:16:12.943 "dma_device_id": "system", 00:16:12.943 "dma_device_type": 1 00:16:12.943 }, 00:16:12.943 { 00:16:12.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:12.943 "dma_device_type": 2 00:16:12.943 } 00:16:12.943 ], 00:16:12.943 "driver_specific": { 00:16:12.943 "raid": { 00:16:12.943 "uuid": "80db366d-999c-48bb-bc97-53d459d5d8ab", 00:16:12.943 "strip_size_kb": 64, 00:16:12.943 "state": "online", 00:16:12.943 "raid_level": "concat", 00:16:12.943 "superblock": true, 00:16:12.943 "num_base_bdevs": 4, 00:16:12.943 "num_base_bdevs_discovered": 4, 00:16:12.943 "num_base_bdevs_operational": 4, 00:16:12.943 "base_bdevs_list": [ 00:16:12.943 { 00:16:12.943 "name": "NewBaseBdev", 00:16:12.943 "uuid": "8edf665d-feb9-4abb-afc6-ad91a3e03f13", 00:16:12.943 "is_configured": true, 00:16:12.943 "data_offset": 2048, 00:16:12.943 "data_size": 63488 00:16:12.943 }, 00:16:12.943 { 00:16:12.943 "name": "BaseBdev2", 00:16:12.943 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:12.943 "is_configured": true, 00:16:12.943 "data_offset": 2048, 00:16:12.943 "data_size": 63488 00:16:12.943 }, 00:16:12.943 { 00:16:12.943 "name": "BaseBdev3", 00:16:12.943 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:12.943 "is_configured": true, 00:16:12.943 "data_offset": 2048, 00:16:12.943 "data_size": 63488 00:16:12.943 }, 00:16:12.943 { 00:16:12.943 "name": "BaseBdev4", 00:16:12.943 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:12.943 "is_configured": true, 00:16:12.943 "data_offset": 2048, 00:16:12.943 "data_size": 63488 00:16:12.943 } 00:16:12.943 ] 00:16:12.943 } 00:16:12.943 } 00:16:12.943 }' 00:16:12.943 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:12.943 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:16:12.943 BaseBdev2 00:16:12.943 BaseBdev3 00:16:12.943 BaseBdev4' 00:16:12.943 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:12.943 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:12.943 04:17:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:13.200 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:13.200 "name": "NewBaseBdev", 00:16:13.200 "aliases": 
[ 00:16:13.200 "8edf665d-feb9-4abb-afc6-ad91a3e03f13" 00:16:13.200 ], 00:16:13.200 "product_name": "Malloc disk", 00:16:13.200 "block_size": 512, 00:16:13.200 "num_blocks": 65536, 00:16:13.200 "uuid": "8edf665d-feb9-4abb-afc6-ad91a3e03f13", 00:16:13.200 "assigned_rate_limits": { 00:16:13.200 "rw_ios_per_sec": 0, 00:16:13.200 "rw_mbytes_per_sec": 0, 00:16:13.200 "r_mbytes_per_sec": 0, 00:16:13.200 "w_mbytes_per_sec": 0 00:16:13.200 }, 00:16:13.200 "claimed": true, 00:16:13.200 "claim_type": "exclusive_write", 00:16:13.200 "zoned": false, 00:16:13.200 "supported_io_types": { 00:16:13.200 "read": true, 00:16:13.200 "write": true, 00:16:13.200 "unmap": true, 00:16:13.200 "write_zeroes": true, 00:16:13.200 "flush": true, 00:16:13.200 "reset": true, 00:16:13.200 "compare": false, 00:16:13.200 "compare_and_write": false, 00:16:13.200 "abort": true, 00:16:13.200 "nvme_admin": false, 00:16:13.200 "nvme_io": false 00:16:13.200 }, 00:16:13.200 "memory_domains": [ 00:16:13.200 { 00:16:13.200 "dma_device_id": "system", 00:16:13.200 "dma_device_type": 1 00:16:13.200 }, 00:16:13.200 { 00:16:13.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.200 "dma_device_type": 2 00:16:13.200 } 00:16:13.200 ], 00:16:13.200 "driver_specific": {} 00:16:13.200 }' 00:16:13.200 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:13.201 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:13.201 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:13.201 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:13.458 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:13.716 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:13.716 "name": "BaseBdev2", 00:16:13.716 "aliases": [ 00:16:13.716 "7cf016cc-6c5f-4dd9-8aad-92eebce772de" 00:16:13.716 ], 00:16:13.716 "product_name": "Malloc disk", 00:16:13.716 "block_size": 512, 00:16:13.716 "num_blocks": 65536, 00:16:13.716 "uuid": "7cf016cc-6c5f-4dd9-8aad-92eebce772de", 00:16:13.716 "assigned_rate_limits": { 00:16:13.716 "rw_ios_per_sec": 0, 00:16:13.716 "rw_mbytes_per_sec": 0, 00:16:13.716 "r_mbytes_per_sec": 0, 00:16:13.716 "w_mbytes_per_sec": 0 00:16:13.716 
}, 00:16:13.716 "claimed": true, 00:16:13.716 "claim_type": "exclusive_write", 00:16:13.716 "zoned": false, 00:16:13.716 "supported_io_types": { 00:16:13.716 "read": true, 00:16:13.716 "write": true, 00:16:13.716 "unmap": true, 00:16:13.716 "write_zeroes": true, 00:16:13.716 "flush": true, 00:16:13.716 "reset": true, 00:16:13.716 "compare": false, 00:16:13.716 "compare_and_write": false, 00:16:13.716 "abort": true, 00:16:13.716 "nvme_admin": false, 00:16:13.716 "nvme_io": false 00:16:13.716 }, 00:16:13.716 "memory_domains": [ 00:16:13.716 { 00:16:13.716 "dma_device_id": "system", 00:16:13.716 "dma_device_type": 1 00:16:13.716 }, 00:16:13.716 { 00:16:13.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.716 "dma_device_type": 2 00:16:13.716 } 00:16:13.716 ], 00:16:13.716 "driver_specific": {} 00:16:13.716 }' 00:16:13.716 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:13.716 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:13.716 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:13.716 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:13.716 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:13.974 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:13.974 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:13.974 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:13.974 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:13.974 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:13.974 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:13.974 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:13.974 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:13.974 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:13.974 04:17:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:14.232 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:14.232 "name": "BaseBdev3", 00:16:14.232 "aliases": [ 00:16:14.232 "aeea2783-a9bd-4d46-ac13-2381a9f484f4" 00:16:14.232 ], 00:16:14.232 "product_name": "Malloc disk", 00:16:14.232 "block_size": 512, 00:16:14.232 "num_blocks": 65536, 00:16:14.232 "uuid": "aeea2783-a9bd-4d46-ac13-2381a9f484f4", 00:16:14.232 "assigned_rate_limits": { 00:16:14.232 "rw_ios_per_sec": 0, 00:16:14.232 "rw_mbytes_per_sec": 0, 00:16:14.232 "r_mbytes_per_sec": 0, 00:16:14.232 "w_mbytes_per_sec": 0 00:16:14.232 }, 00:16:14.232 "claimed": true, 00:16:14.232 "claim_type": "exclusive_write", 00:16:14.232 "zoned": false, 00:16:14.232 "supported_io_types": { 00:16:14.232 "read": true, 00:16:14.232 "write": true, 00:16:14.232 "unmap": true, 00:16:14.232 "write_zeroes": true, 00:16:14.232 "flush": true, 00:16:14.232 "reset": true, 00:16:14.232 "compare": false, 00:16:14.232 "compare_and_write": false, 00:16:14.232 "abort": true, 00:16:14.232 
"nvme_admin": false, 00:16:14.232 "nvme_io": false 00:16:14.232 }, 00:16:14.232 "memory_domains": [ 00:16:14.232 { 00:16:14.232 "dma_device_id": "system", 00:16:14.232 "dma_device_type": 1 00:16:14.232 }, 00:16:14.232 { 00:16:14.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.232 "dma_device_type": 2 00:16:14.232 } 00:16:14.232 ], 00:16:14.232 "driver_specific": {} 00:16:14.232 }' 00:16:14.232 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:14.232 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:14.232 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:14.232 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:14.490 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:14.748 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:14.748 "name": "BaseBdev4", 00:16:14.748 "aliases": [ 00:16:14.748 "9aac17b0-178f-460d-8dc3-c876e19960bf" 00:16:14.748 ], 00:16:14.748 "product_name": "Malloc disk", 00:16:14.748 "block_size": 512, 00:16:14.748 "num_blocks": 65536, 00:16:14.748 "uuid": "9aac17b0-178f-460d-8dc3-c876e19960bf", 00:16:14.748 "assigned_rate_limits": { 00:16:14.748 "rw_ios_per_sec": 0, 00:16:14.748 "rw_mbytes_per_sec": 0, 00:16:14.748 "r_mbytes_per_sec": 0, 00:16:14.748 "w_mbytes_per_sec": 0 00:16:14.748 }, 00:16:14.748 "claimed": true, 00:16:14.748 "claim_type": "exclusive_write", 00:16:14.748 "zoned": false, 00:16:14.748 "supported_io_types": { 00:16:14.748 "read": true, 00:16:14.748 "write": true, 00:16:14.748 "unmap": true, 00:16:14.748 "write_zeroes": true, 00:16:14.748 "flush": true, 00:16:14.748 "reset": true, 00:16:14.748 "compare": false, 00:16:14.748 "compare_and_write": false, 00:16:14.748 "abort": true, 00:16:14.748 "nvme_admin": false, 00:16:14.748 "nvme_io": false 00:16:14.748 }, 00:16:14.748 "memory_domains": [ 00:16:14.748 { 00:16:14.748 "dma_device_id": "system", 00:16:14.748 "dma_device_type": 1 00:16:14.748 }, 00:16:14.748 { 00:16:14.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.748 "dma_device_type": 2 00:16:14.748 } 00:16:14.748 ], 00:16:14.748 "driver_specific": {} 00:16:14.748 }' 00:16:14.748 04:17:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:14.748 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:14.748 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:14.749 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:15.007 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:15.007 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:15.007 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:15.007 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:15.007 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:15.007 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:15.007 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:15.007 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:15.007 04:17:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:15.264 [2024-05-15 04:17:03.202942] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:15.264 [2024-05-15 04:17:03.202969] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:15.264 [2024-05-15 04:17:03.203047] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:15.264 [2024-05-15 04:17:03.203130] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:15.264 [2024-05-15 04:17:03.203153] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c04cd0 name Existed_Raid, state offline 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 3885756 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3885756 ']' 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 3885756 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3885756 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3885756' 00:16:15.264 killing process with pid 3885756 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 3885756 00:16:15.264 [2024-05-15 04:17:03.253032] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:15.264 04:17:03 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@970 -- # wait 3885756 00:16:15.521 [2024-05-15 04:17:03.299463] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:15.779 04:17:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:16:15.779 00:16:15.779 real 0m31.555s 00:16:15.779 user 0m58.827s 00:16:15.779 sys 0m4.353s 00:16:15.779 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:16:15.779 04:17:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:15.779 ************************************ 00:16:15.779 END TEST raid_state_function_test_sb 00:16:15.779 ************************************ 00:16:15.779 04:17:03 bdev_raid -- bdev/bdev_raid.sh@805 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:16:15.779 04:17:03 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:16:15.779 04:17:03 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:16:15.779 04:17:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:15.779 ************************************ 00:16:15.779 START TEST raid_superblock_test 00:16:15.779 ************************************ 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test concat 4 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=concat 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=4 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' concat '!=' raid1 ']' 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size=64 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@406 -- # strip_size_create_arg='-z 64' 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=3890150 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 3890150 /var/tmp/spdk-raid.sock 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # 
'[' -z 3890150 ']' 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:15.779 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:16:15.779 04:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:15.779 [2024-05-15 04:17:03.670794] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:16:15.779 [2024-05-15 04:17:03.670885] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3890150 ] 00:16:15.779 [2024-05-15 04:17:03.745094] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:16.037 [2024-05-15 04:17:03.852125] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:16.037 [2024-05-15 04:17:03.915456] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:16.037 [2024-05-15 04:17:03.915500] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:16.037 04:17:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:16.295 malloc1 00:16:16.295 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:16.554 [2024-05-15 04:17:04.524731] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:16.554 [2024-05-15 04:17:04.524795] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:16.554 [2024-05-15 04:17:04.524837] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1346c20 
00:16:16.554 [2024-05-15 04:17:04.524876] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:16.554 [2024-05-15 04:17:04.526685] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:16.554 [2024-05-15 04:17:04.526715] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:16.554 pt1 00:16:16.554 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:16.554 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:16.554 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:16:16.554 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:16:16.554 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:16.555 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:16.555 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:16.555 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:16.555 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:16.844 malloc2 00:16:16.844 04:17:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:17.122 [2024-05-15 04:17:05.021800] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:17.122 [2024-05-15 04:17:05.021885] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:17.122 [2024-05-15 04:17:05.021911] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x133ec00 00:16:17.122 [2024-05-15 04:17:05.021936] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:17.122 [2024-05-15 04:17:05.023732] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:17.122 [2024-05-15 04:17:05.023757] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:17.122 pt2 00:16:17.122 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:17.122 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:17.122 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:16:17.122 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:16:17.122 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:17.122 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:17.122 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:17.122 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:17.122 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc3 00:16:17.380 malloc3 00:16:17.380 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:17.638 [2024-05-15 04:17:05.562544] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:17.638 [2024-05-15 04:17:05.562602] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:17.638 [2024-05-15 04:17:05.562629] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ef9c0 00:16:17.638 [2024-05-15 04:17:05.562642] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:17.638 [2024-05-15 04:17:05.564114] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:17.638 [2024-05-15 04:17:05.564152] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:17.638 pt3 00:16:17.638 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:17.638 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:17.638 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc4 00:16:17.638 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt4 00:16:17.638 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:16:17.638 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:17.638 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:16:17.638 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:17.638 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:16:17.896 malloc4 00:16:17.896 04:17:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:18.154 [2024-05-15 04:17:06.151226] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:18.154 [2024-05-15 04:17:06.151294] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:18.154 [2024-05-15 04:17:06.151323] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13428e0 00:16:18.154 [2024-05-15 04:17:06.151339] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:18.154 [2024-05-15 04:17:06.153118] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:18.154 [2024-05-15 04:17:06.153148] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:18.154 pt4 00:16:18.154 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:16:18.154 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:16:18.154 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create 
-z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:16:18.412 [2024-05-15 04:17:06.395917] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:18.412 [2024-05-15 04:17:06.397300] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:18.412 [2024-05-15 04:17:06.397372] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:18.412 [2024-05-15 04:17:06.397430] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:18.412 [2024-05-15 04:17:06.397664] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1341cc0 00:16:18.412 [2024-05-15 04:17:06.397682] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:18.412 [2024-05-15 04:17:06.397939] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x133f190 00:16:18.412 [2024-05-15 04:17:06.398124] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1341cc0 00:16:18.412 [2024-05-15 04:17:06.398140] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1341cc0 00:16:18.412 [2024-05-15 04:17:06.398289] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.412 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:18.670 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:18.670 "name": "raid_bdev1", 00:16:18.670 "uuid": "c17bbd91-d431-47ce-a9d6-16c6fe60ae76", 00:16:18.670 "strip_size_kb": 64, 00:16:18.670 "state": "online", 00:16:18.670 "raid_level": "concat", 00:16:18.670 "superblock": true, 00:16:18.670 "num_base_bdevs": 4, 00:16:18.670 "num_base_bdevs_discovered": 4, 00:16:18.670 "num_base_bdevs_operational": 4, 00:16:18.670 "base_bdevs_list": [ 00:16:18.670 { 00:16:18.670 "name": "pt1", 00:16:18.670 "uuid": "dc7aada9-a51f-5584-aae8-9da0b43d72ee", 00:16:18.670 "is_configured": true, 00:16:18.670 "data_offset": 2048, 00:16:18.670 "data_size": 63488 00:16:18.670 }, 00:16:18.670 { 00:16:18.670 "name": "pt2", 00:16:18.670 "uuid": "7281e7e1-f576-5064-8913-bc24bff9bdcd", 00:16:18.670 
"is_configured": true, 00:16:18.670 "data_offset": 2048, 00:16:18.670 "data_size": 63488 00:16:18.670 }, 00:16:18.670 { 00:16:18.670 "name": "pt3", 00:16:18.670 "uuid": "029683a3-2d24-5a3f-9a97-9cb036de46d1", 00:16:18.670 "is_configured": true, 00:16:18.670 "data_offset": 2048, 00:16:18.670 "data_size": 63488 00:16:18.670 }, 00:16:18.670 { 00:16:18.670 "name": "pt4", 00:16:18.670 "uuid": "e1f1ebc3-60d5-5c73-b39f-0a74d972b942", 00:16:18.670 "is_configured": true, 00:16:18.670 "data_offset": 2048, 00:16:18.670 "data_size": 63488 00:16:18.670 } 00:16:18.670 ] 00:16:18.670 }' 00:16:18.670 04:17:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:18.670 04:17:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.236 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:16:19.236 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:16:19.236 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:19.236 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:19.236 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:19.236 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:16:19.236 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:19.236 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:19.493 [2024-05-15 04:17:07.418868] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:19.493 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:19.493 "name": "raid_bdev1", 00:16:19.493 "aliases": [ 00:16:19.493 "c17bbd91-d431-47ce-a9d6-16c6fe60ae76" 00:16:19.493 ], 00:16:19.493 "product_name": "Raid Volume", 00:16:19.493 "block_size": 512, 00:16:19.493 "num_blocks": 253952, 00:16:19.493 "uuid": "c17bbd91-d431-47ce-a9d6-16c6fe60ae76", 00:16:19.493 "assigned_rate_limits": { 00:16:19.493 "rw_ios_per_sec": 0, 00:16:19.493 "rw_mbytes_per_sec": 0, 00:16:19.493 "r_mbytes_per_sec": 0, 00:16:19.493 "w_mbytes_per_sec": 0 00:16:19.493 }, 00:16:19.493 "claimed": false, 00:16:19.493 "zoned": false, 00:16:19.493 "supported_io_types": { 00:16:19.493 "read": true, 00:16:19.493 "write": true, 00:16:19.493 "unmap": true, 00:16:19.493 "write_zeroes": true, 00:16:19.493 "flush": true, 00:16:19.493 "reset": true, 00:16:19.493 "compare": false, 00:16:19.493 "compare_and_write": false, 00:16:19.493 "abort": false, 00:16:19.493 "nvme_admin": false, 00:16:19.493 "nvme_io": false 00:16:19.493 }, 00:16:19.493 "memory_domains": [ 00:16:19.493 { 00:16:19.493 "dma_device_id": "system", 00:16:19.493 "dma_device_type": 1 00:16:19.493 }, 00:16:19.493 { 00:16:19.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.493 "dma_device_type": 2 00:16:19.493 }, 00:16:19.493 { 00:16:19.493 "dma_device_id": "system", 00:16:19.493 "dma_device_type": 1 00:16:19.493 }, 00:16:19.493 { 00:16:19.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.493 "dma_device_type": 2 00:16:19.493 }, 00:16:19.493 { 00:16:19.493 "dma_device_id": "system", 00:16:19.493 "dma_device_type": 1 00:16:19.493 }, 00:16:19.493 { 00:16:19.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.493 
"dma_device_type": 2 00:16:19.493 }, 00:16:19.493 { 00:16:19.493 "dma_device_id": "system", 00:16:19.493 "dma_device_type": 1 00:16:19.493 }, 00:16:19.493 { 00:16:19.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.494 "dma_device_type": 2 00:16:19.494 } 00:16:19.494 ], 00:16:19.494 "driver_specific": { 00:16:19.494 "raid": { 00:16:19.494 "uuid": "c17bbd91-d431-47ce-a9d6-16c6fe60ae76", 00:16:19.494 "strip_size_kb": 64, 00:16:19.494 "state": "online", 00:16:19.494 "raid_level": "concat", 00:16:19.494 "superblock": true, 00:16:19.494 "num_base_bdevs": 4, 00:16:19.494 "num_base_bdevs_discovered": 4, 00:16:19.494 "num_base_bdevs_operational": 4, 00:16:19.494 "base_bdevs_list": [ 00:16:19.494 { 00:16:19.494 "name": "pt1", 00:16:19.494 "uuid": "dc7aada9-a51f-5584-aae8-9da0b43d72ee", 00:16:19.494 "is_configured": true, 00:16:19.494 "data_offset": 2048, 00:16:19.494 "data_size": 63488 00:16:19.494 }, 00:16:19.494 { 00:16:19.494 "name": "pt2", 00:16:19.494 "uuid": "7281e7e1-f576-5064-8913-bc24bff9bdcd", 00:16:19.494 "is_configured": true, 00:16:19.494 "data_offset": 2048, 00:16:19.494 "data_size": 63488 00:16:19.494 }, 00:16:19.494 { 00:16:19.494 "name": "pt3", 00:16:19.494 "uuid": "029683a3-2d24-5a3f-9a97-9cb036de46d1", 00:16:19.494 "is_configured": true, 00:16:19.494 "data_offset": 2048, 00:16:19.494 "data_size": 63488 00:16:19.494 }, 00:16:19.494 { 00:16:19.494 "name": "pt4", 00:16:19.494 "uuid": "e1f1ebc3-60d5-5c73-b39f-0a74d972b942", 00:16:19.494 "is_configured": true, 00:16:19.494 "data_offset": 2048, 00:16:19.494 "data_size": 63488 00:16:19.494 } 00:16:19.494 ] 00:16:19.494 } 00:16:19.494 } 00:16:19.494 }' 00:16:19.494 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:19.494 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:16:19.494 pt2 00:16:19.494 pt3 00:16:19.494 pt4' 00:16:19.494 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:19.494 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:19.494 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:19.752 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:19.752 "name": "pt1", 00:16:19.752 "aliases": [ 00:16:19.752 "dc7aada9-a51f-5584-aae8-9da0b43d72ee" 00:16:19.752 ], 00:16:19.752 "product_name": "passthru", 00:16:19.752 "block_size": 512, 00:16:19.752 "num_blocks": 65536, 00:16:19.752 "uuid": "dc7aada9-a51f-5584-aae8-9da0b43d72ee", 00:16:19.752 "assigned_rate_limits": { 00:16:19.752 "rw_ios_per_sec": 0, 00:16:19.752 "rw_mbytes_per_sec": 0, 00:16:19.752 "r_mbytes_per_sec": 0, 00:16:19.752 "w_mbytes_per_sec": 0 00:16:19.752 }, 00:16:19.752 "claimed": true, 00:16:19.752 "claim_type": "exclusive_write", 00:16:19.752 "zoned": false, 00:16:19.752 "supported_io_types": { 00:16:19.752 "read": true, 00:16:19.752 "write": true, 00:16:19.752 "unmap": true, 00:16:19.752 "write_zeroes": true, 00:16:19.752 "flush": true, 00:16:19.752 "reset": true, 00:16:19.752 "compare": false, 00:16:19.752 "compare_and_write": false, 00:16:19.752 "abort": true, 00:16:19.752 "nvme_admin": false, 00:16:19.752 "nvme_io": false 00:16:19.752 }, 00:16:19.752 "memory_domains": [ 00:16:19.752 { 00:16:19.752 "dma_device_id": "system", 00:16:19.752 
"dma_device_type": 1 00:16:19.752 }, 00:16:19.752 { 00:16:19.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:19.752 "dma_device_type": 2 00:16:19.752 } 00:16:19.752 ], 00:16:19.752 "driver_specific": { 00:16:19.752 "passthru": { 00:16:19.752 "name": "pt1", 00:16:19.752 "base_bdev_name": "malloc1" 00:16:19.752 } 00:16:19.752 } 00:16:19.752 }' 00:16:19.752 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:19.752 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:20.008 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:20.008 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:20.008 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:20.008 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:20.008 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:20.008 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:20.008 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:20.008 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:20.008 04:17:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:20.008 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:20.008 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:20.008 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:20.008 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:20.266 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:20.266 "name": "pt2", 00:16:20.266 "aliases": [ 00:16:20.266 "7281e7e1-f576-5064-8913-bc24bff9bdcd" 00:16:20.266 ], 00:16:20.266 "product_name": "passthru", 00:16:20.266 "block_size": 512, 00:16:20.266 "num_blocks": 65536, 00:16:20.266 "uuid": "7281e7e1-f576-5064-8913-bc24bff9bdcd", 00:16:20.266 "assigned_rate_limits": { 00:16:20.266 "rw_ios_per_sec": 0, 00:16:20.266 "rw_mbytes_per_sec": 0, 00:16:20.266 "r_mbytes_per_sec": 0, 00:16:20.266 "w_mbytes_per_sec": 0 00:16:20.266 }, 00:16:20.266 "claimed": true, 00:16:20.266 "claim_type": "exclusive_write", 00:16:20.266 "zoned": false, 00:16:20.266 "supported_io_types": { 00:16:20.266 "read": true, 00:16:20.266 "write": true, 00:16:20.266 "unmap": true, 00:16:20.266 "write_zeroes": true, 00:16:20.266 "flush": true, 00:16:20.266 "reset": true, 00:16:20.266 "compare": false, 00:16:20.266 "compare_and_write": false, 00:16:20.266 "abort": true, 00:16:20.266 "nvme_admin": false, 00:16:20.266 "nvme_io": false 00:16:20.266 }, 00:16:20.266 "memory_domains": [ 00:16:20.266 { 00:16:20.266 "dma_device_id": "system", 00:16:20.266 "dma_device_type": 1 00:16:20.266 }, 00:16:20.266 { 00:16:20.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:20.266 "dma_device_type": 2 00:16:20.266 } 00:16:20.266 ], 00:16:20.266 "driver_specific": { 00:16:20.266 "passthru": { 00:16:20.266 "name": "pt2", 00:16:20.266 "base_bdev_name": "malloc2" 00:16:20.266 } 00:16:20.266 } 00:16:20.266 }' 00:16:20.266 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq 
.block_size 00:16:20.524 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:20.524 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:20.524 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:20.524 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:20.524 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:20.524 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:20.524 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:20.524 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:20.524 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:20.524 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:20.782 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:20.782 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:20.782 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:20.782 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:21.040 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:21.040 "name": "pt3", 00:16:21.040 "aliases": [ 00:16:21.040 "029683a3-2d24-5a3f-9a97-9cb036de46d1" 00:16:21.040 ], 00:16:21.040 "product_name": "passthru", 00:16:21.040 "block_size": 512, 00:16:21.040 "num_blocks": 65536, 00:16:21.040 "uuid": "029683a3-2d24-5a3f-9a97-9cb036de46d1", 00:16:21.040 "assigned_rate_limits": { 00:16:21.040 "rw_ios_per_sec": 0, 00:16:21.040 "rw_mbytes_per_sec": 0, 00:16:21.040 "r_mbytes_per_sec": 0, 00:16:21.040 "w_mbytes_per_sec": 0 00:16:21.040 }, 00:16:21.040 "claimed": true, 00:16:21.040 "claim_type": "exclusive_write", 00:16:21.040 "zoned": false, 00:16:21.040 "supported_io_types": { 00:16:21.040 "read": true, 00:16:21.040 "write": true, 00:16:21.040 "unmap": true, 00:16:21.040 "write_zeroes": true, 00:16:21.040 "flush": true, 00:16:21.040 "reset": true, 00:16:21.040 "compare": false, 00:16:21.040 "compare_and_write": false, 00:16:21.040 "abort": true, 00:16:21.041 "nvme_admin": false, 00:16:21.041 "nvme_io": false 00:16:21.041 }, 00:16:21.041 "memory_domains": [ 00:16:21.041 { 00:16:21.041 "dma_device_id": "system", 00:16:21.041 "dma_device_type": 1 00:16:21.041 }, 00:16:21.041 { 00:16:21.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.041 "dma_device_type": 2 00:16:21.041 } 00:16:21.041 ], 00:16:21.041 "driver_specific": { 00:16:21.041 "passthru": { 00:16:21.041 "name": "pt3", 00:16:21.041 "base_bdev_name": "malloc3" 00:16:21.041 } 00:16:21.041 } 00:16:21.041 }' 00:16:21.041 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:21.041 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:21.041 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:21.041 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:21.041 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:21.041 04:17:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:21.041 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:21.041 04:17:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:21.041 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:21.041 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:21.298 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:21.298 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:21.298 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:21.298 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:21.298 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:21.556 "name": "pt4", 00:16:21.556 "aliases": [ 00:16:21.556 "e1f1ebc3-60d5-5c73-b39f-0a74d972b942" 00:16:21.556 ], 00:16:21.556 "product_name": "passthru", 00:16:21.556 "block_size": 512, 00:16:21.556 "num_blocks": 65536, 00:16:21.556 "uuid": "e1f1ebc3-60d5-5c73-b39f-0a74d972b942", 00:16:21.556 "assigned_rate_limits": { 00:16:21.556 "rw_ios_per_sec": 0, 00:16:21.556 "rw_mbytes_per_sec": 0, 00:16:21.556 "r_mbytes_per_sec": 0, 00:16:21.556 "w_mbytes_per_sec": 0 00:16:21.556 }, 00:16:21.556 "claimed": true, 00:16:21.556 "claim_type": "exclusive_write", 00:16:21.556 "zoned": false, 00:16:21.556 "supported_io_types": { 00:16:21.556 "read": true, 00:16:21.556 "write": true, 00:16:21.556 "unmap": true, 00:16:21.556 "write_zeroes": true, 00:16:21.556 "flush": true, 00:16:21.556 "reset": true, 00:16:21.556 "compare": false, 00:16:21.556 "compare_and_write": false, 00:16:21.556 "abort": true, 00:16:21.556 "nvme_admin": false, 00:16:21.556 "nvme_io": false 00:16:21.556 }, 00:16:21.556 "memory_domains": [ 00:16:21.556 { 00:16:21.556 "dma_device_id": "system", 00:16:21.556 "dma_device_type": 1 00:16:21.556 }, 00:16:21.556 { 00:16:21.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.556 "dma_device_type": 2 00:16:21.556 } 00:16:21.556 ], 00:16:21.556 "driver_specific": { 00:16:21.556 "passthru": { 00:16:21.556 "name": "pt4", 00:16:21.556 "base_bdev_name": "malloc4" 00:16:21.556 } 00:16:21.556 } 00:16:21.556 }' 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:21.556 04:17:09 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:21.816 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:21.816 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:21.816 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:21.816 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:16:22.075 [2024-05-15 04:17:09.845368] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:22.075 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=c17bbd91-d431-47ce-a9d6-16c6fe60ae76 00:16:22.075 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z c17bbd91-d431-47ce-a9d6-16c6fe60ae76 ']' 00:16:22.075 04:17:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:22.333 [2024-05-15 04:17:10.109768] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:22.333 [2024-05-15 04:17:10.109793] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:22.333 [2024-05-15 04:17:10.109884] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:22.333 [2024-05-15 04:17:10.109952] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:22.333 [2024-05-15 04:17:10.109965] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1341cc0 name raid_bdev1, state offline 00:16:22.333 04:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.333 04:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:16:22.593 04:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:16:22.593 04:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:16:22.593 04:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:22.593 04:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:22.852 04:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:22.852 04:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:23.109 04:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:23.109 04:17:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:23.367 04:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:16:23.367 04:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:16:23.625 
04:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:23.625 04:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:23.883 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:16:24.141 [2024-05-15 04:17:11.974688] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:24.141 [2024-05-15 04:17:11.976134] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:24.141 [2024-05-15 04:17:11.976185] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:24.141 [2024-05-15 04:17:11.976231] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:16:24.141 [2024-05-15 04:17:11.976298] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:24.141 [2024-05-15 04:17:11.976356] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:24.141 [2024-05-15 04:17:11.976387] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:24.141 [2024-05-15 04:17:11.976417] 
bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:16:24.141 [2024-05-15 04:17:11.976440] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:24.141 [2024-05-15 04:17:11.976452] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1343aa0 name raid_bdev1, state configuring 00:16:24.141 request: 00:16:24.141 { 00:16:24.141 "name": "raid_bdev1", 00:16:24.141 "raid_level": "concat", 00:16:24.141 "base_bdevs": [ 00:16:24.141 "malloc1", 00:16:24.141 "malloc2", 00:16:24.141 "malloc3", 00:16:24.141 "malloc4" 00:16:24.141 ], 00:16:24.141 "superblock": false, 00:16:24.141 "strip_size_kb": 64, 00:16:24.141 "method": "bdev_raid_create", 00:16:24.141 "req_id": 1 00:16:24.141 } 00:16:24.141 Got JSON-RPC error response 00:16:24.141 response: 00:16:24.141 { 00:16:24.141 "code": -17, 00:16:24.141 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:24.141 } 00:16:24.141 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:24.141 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:24.141 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:24.141 04:17:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:24.141 04:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.141 04:17:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:16:24.399 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:16:24.399 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:16:24.399 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:24.657 [2024-05-15 04:17:12.467903] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:24.657 [2024-05-15 04:17:12.467972] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:24.657 [2024-05-15 04:17:12.468000] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1344a80 00:16:24.657 [2024-05-15 04:17:12.468015] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:24.657 [2024-05-15 04:17:12.469811] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:24.657 [2024-05-15 04:17:12.469849] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:24.657 [2024-05-15 04:17:12.469942] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:24.657 [2024-05-15 04:17:12.469990] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:24.657 pt1 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.657 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:24.915 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:24.915 "name": "raid_bdev1", 00:16:24.915 "uuid": "c17bbd91-d431-47ce-a9d6-16c6fe60ae76", 00:16:24.915 "strip_size_kb": 64, 00:16:24.915 "state": "configuring", 00:16:24.915 "raid_level": "concat", 00:16:24.915 "superblock": true, 00:16:24.915 "num_base_bdevs": 4, 00:16:24.915 "num_base_bdevs_discovered": 1, 00:16:24.915 "num_base_bdevs_operational": 4, 00:16:24.915 "base_bdevs_list": [ 00:16:24.915 { 00:16:24.915 "name": "pt1", 00:16:24.915 "uuid": "dc7aada9-a51f-5584-aae8-9da0b43d72ee", 00:16:24.915 "is_configured": true, 00:16:24.915 "data_offset": 2048, 00:16:24.915 "data_size": 63488 00:16:24.915 }, 00:16:24.915 { 00:16:24.915 "name": null, 00:16:24.915 "uuid": "7281e7e1-f576-5064-8913-bc24bff9bdcd", 00:16:24.915 "is_configured": false, 00:16:24.915 "data_offset": 2048, 00:16:24.915 "data_size": 63488 00:16:24.915 }, 00:16:24.915 { 00:16:24.915 "name": null, 00:16:24.915 "uuid": "029683a3-2d24-5a3f-9a97-9cb036de46d1", 00:16:24.915 "is_configured": false, 00:16:24.915 "data_offset": 2048, 00:16:24.915 "data_size": 63488 00:16:24.915 }, 00:16:24.915 { 00:16:24.915 "name": null, 00:16:24.915 "uuid": "e1f1ebc3-60d5-5c73-b39f-0a74d972b942", 00:16:24.915 "is_configured": false, 00:16:24.915 "data_offset": 2048, 00:16:24.915 "data_size": 63488 00:16:24.915 } 00:16:24.915 ] 00:16:24.915 }' 00:16:24.915 04:17:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:24.915 04:17:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.481 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 4 -gt 2 ']' 00:16:25.481 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:25.737 [2024-05-15 04:17:13.542792] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:25.737 [2024-05-15 04:17:13.542882] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:25.737 [2024-05-15 04:17:13.542908] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13417f0 00:16:25.737 [2024-05-15 04:17:13.542921] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:25.737 [2024-05-15 04:17:13.543291] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:16:25.737 [2024-05-15 04:17:13.543313] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:25.737 [2024-05-15 04:17:13.543387] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:25.737 [2024-05-15 04:17:13.543412] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:25.737 pt2 00:16:25.737 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:25.994 [2024-05-15 04:17:13.787448] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.994 04:17:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:26.251 04:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:26.251 "name": "raid_bdev1", 00:16:26.251 "uuid": "c17bbd91-d431-47ce-a9d6-16c6fe60ae76", 00:16:26.251 "strip_size_kb": 64, 00:16:26.251 "state": "configuring", 00:16:26.251 "raid_level": "concat", 00:16:26.251 "superblock": true, 00:16:26.251 "num_base_bdevs": 4, 00:16:26.251 "num_base_bdevs_discovered": 1, 00:16:26.251 "num_base_bdevs_operational": 4, 00:16:26.251 "base_bdevs_list": [ 00:16:26.251 { 00:16:26.251 "name": "pt1", 00:16:26.251 "uuid": "dc7aada9-a51f-5584-aae8-9da0b43d72ee", 00:16:26.251 "is_configured": true, 00:16:26.251 "data_offset": 2048, 00:16:26.251 "data_size": 63488 00:16:26.251 }, 00:16:26.251 { 00:16:26.251 "name": null, 00:16:26.252 "uuid": "7281e7e1-f576-5064-8913-bc24bff9bdcd", 00:16:26.252 "is_configured": false, 00:16:26.252 "data_offset": 2048, 00:16:26.252 "data_size": 63488 00:16:26.252 }, 00:16:26.252 { 00:16:26.252 "name": null, 00:16:26.252 "uuid": "029683a3-2d24-5a3f-9a97-9cb036de46d1", 00:16:26.252 "is_configured": false, 00:16:26.252 "data_offset": 2048, 00:16:26.252 "data_size": 63488 00:16:26.252 }, 00:16:26.252 { 00:16:26.252 "name": null, 00:16:26.252 "uuid": "e1f1ebc3-60d5-5c73-b39f-0a74d972b942", 00:16:26.252 "is_configured": false, 00:16:26.252 "data_offset": 2048, 00:16:26.252 "data_size": 63488 00:16:26.252 } 00:16:26.252 ] 
00:16:26.252 }' 00:16:26.252 04:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:26.252 04:17:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:26.816 04:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:16:26.816 04:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:26.816 04:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:26.816 [2024-05-15 04:17:14.822217] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:26.816 [2024-05-15 04:17:14.822287] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:26.816 [2024-05-15 04:17:14.822314] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1340570 00:16:26.816 [2024-05-15 04:17:14.822329] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:26.816 [2024-05-15 04:17:14.822719] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:26.816 [2024-05-15 04:17:14.822746] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:26.816 [2024-05-15 04:17:14.822841] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:26.816 [2024-05-15 04:17:14.822873] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:26.816 pt2 00:16:27.074 04:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:16:27.074 04:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:27.074 04:17:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:27.074 [2024-05-15 04:17:15.054815] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:27.074 [2024-05-15 04:17:15.054870] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:27.074 [2024-05-15 04:17:15.054892] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1344f00 00:16:27.074 [2024-05-15 04:17:15.054906] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:27.074 [2024-05-15 04:17:15.055179] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:27.074 [2024-05-15 04:17:15.055206] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:27.074 [2024-05-15 04:17:15.055270] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:27.074 [2024-05-15 04:17:15.055296] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:27.074 pt3 00:16:27.074 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:16:27.074 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:27.074 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:16:27.332 
[2024-05-15 04:17:15.287442] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:16:27.332 [2024-05-15 04:17:15.287488] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:27.332 [2024-05-15 04:17:15.287509] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1347b90 00:16:27.332 [2024-05-15 04:17:15.287523] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:27.332 [2024-05-15 04:17:15.287798] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:27.332 [2024-05-15 04:17:15.287844] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:16:27.332 [2024-05-15 04:17:15.287903] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:16:27.332 [2024-05-15 04:17:15.287929] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:16:27.332 [2024-05-15 04:17:15.288059] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1342f10 00:16:27.332 [2024-05-15 04:17:15.288076] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:16:27.332 [2024-05-15 04:17:15.288254] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x133e240 00:16:27.332 [2024-05-15 04:17:15.288407] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1342f10 00:16:27.332 [2024-05-15 04:17:15.288423] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1342f10 00:16:27.332 [2024-05-15 04:17:15.288530] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:27.332 pt4 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=concat 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=64 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.332 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:27.590 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:27.590 "name": "raid_bdev1", 00:16:27.590 "uuid": "c17bbd91-d431-47ce-a9d6-16c6fe60ae76", 
00:16:27.590 "strip_size_kb": 64, 00:16:27.590 "state": "online", 00:16:27.590 "raid_level": "concat", 00:16:27.590 "superblock": true, 00:16:27.590 "num_base_bdevs": 4, 00:16:27.590 "num_base_bdevs_discovered": 4, 00:16:27.590 "num_base_bdevs_operational": 4, 00:16:27.590 "base_bdevs_list": [ 00:16:27.590 { 00:16:27.590 "name": "pt1", 00:16:27.590 "uuid": "dc7aada9-a51f-5584-aae8-9da0b43d72ee", 00:16:27.590 "is_configured": true, 00:16:27.590 "data_offset": 2048, 00:16:27.590 "data_size": 63488 00:16:27.590 }, 00:16:27.590 { 00:16:27.590 "name": "pt2", 00:16:27.590 "uuid": "7281e7e1-f576-5064-8913-bc24bff9bdcd", 00:16:27.590 "is_configured": true, 00:16:27.590 "data_offset": 2048, 00:16:27.590 "data_size": 63488 00:16:27.590 }, 00:16:27.590 { 00:16:27.590 "name": "pt3", 00:16:27.590 "uuid": "029683a3-2d24-5a3f-9a97-9cb036de46d1", 00:16:27.590 "is_configured": true, 00:16:27.590 "data_offset": 2048, 00:16:27.590 "data_size": 63488 00:16:27.590 }, 00:16:27.590 { 00:16:27.590 "name": "pt4", 00:16:27.590 "uuid": "e1f1ebc3-60d5-5c73-b39f-0a74d972b942", 00:16:27.590 "is_configured": true, 00:16:27.590 "data_offset": 2048, 00:16:27.590 "data_size": 63488 00:16:27.590 } 00:16:27.590 ] 00:16:27.590 }' 00:16:27.590 04:17:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:27.590 04:17:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.156 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:16:28.156 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:16:28.156 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:28.156 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:28.156 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:28.156 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:16:28.156 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:28.156 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:28.414 [2024-05-15 04:17:16.318465] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:28.414 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:28.414 "name": "raid_bdev1", 00:16:28.414 "aliases": [ 00:16:28.414 "c17bbd91-d431-47ce-a9d6-16c6fe60ae76" 00:16:28.414 ], 00:16:28.414 "product_name": "Raid Volume", 00:16:28.414 "block_size": 512, 00:16:28.414 "num_blocks": 253952, 00:16:28.414 "uuid": "c17bbd91-d431-47ce-a9d6-16c6fe60ae76", 00:16:28.414 "assigned_rate_limits": { 00:16:28.414 "rw_ios_per_sec": 0, 00:16:28.414 "rw_mbytes_per_sec": 0, 00:16:28.414 "r_mbytes_per_sec": 0, 00:16:28.414 "w_mbytes_per_sec": 0 00:16:28.414 }, 00:16:28.414 "claimed": false, 00:16:28.414 "zoned": false, 00:16:28.414 "supported_io_types": { 00:16:28.414 "read": true, 00:16:28.414 "write": true, 00:16:28.414 "unmap": true, 00:16:28.414 "write_zeroes": true, 00:16:28.414 "flush": true, 00:16:28.414 "reset": true, 00:16:28.414 "compare": false, 00:16:28.414 "compare_and_write": false, 00:16:28.414 "abort": false, 00:16:28.414 "nvme_admin": false, 00:16:28.414 "nvme_io": false 00:16:28.414 }, 00:16:28.414 "memory_domains": [ 
00:16:28.414 { 00:16:28.414 "dma_device_id": "system", 00:16:28.414 "dma_device_type": 1 00:16:28.414 }, 00:16:28.414 { 00:16:28.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.414 "dma_device_type": 2 00:16:28.414 }, 00:16:28.414 { 00:16:28.414 "dma_device_id": "system", 00:16:28.414 "dma_device_type": 1 00:16:28.414 }, 00:16:28.414 { 00:16:28.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.414 "dma_device_type": 2 00:16:28.414 }, 00:16:28.414 { 00:16:28.414 "dma_device_id": "system", 00:16:28.414 "dma_device_type": 1 00:16:28.414 }, 00:16:28.414 { 00:16:28.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.414 "dma_device_type": 2 00:16:28.414 }, 00:16:28.414 { 00:16:28.414 "dma_device_id": "system", 00:16:28.414 "dma_device_type": 1 00:16:28.414 }, 00:16:28.414 { 00:16:28.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.414 "dma_device_type": 2 00:16:28.414 } 00:16:28.414 ], 00:16:28.414 "driver_specific": { 00:16:28.414 "raid": { 00:16:28.414 "uuid": "c17bbd91-d431-47ce-a9d6-16c6fe60ae76", 00:16:28.414 "strip_size_kb": 64, 00:16:28.414 "state": "online", 00:16:28.414 "raid_level": "concat", 00:16:28.414 "superblock": true, 00:16:28.414 "num_base_bdevs": 4, 00:16:28.414 "num_base_bdevs_discovered": 4, 00:16:28.414 "num_base_bdevs_operational": 4, 00:16:28.414 "base_bdevs_list": [ 00:16:28.414 { 00:16:28.414 "name": "pt1", 00:16:28.414 "uuid": "dc7aada9-a51f-5584-aae8-9da0b43d72ee", 00:16:28.414 "is_configured": true, 00:16:28.414 "data_offset": 2048, 00:16:28.414 "data_size": 63488 00:16:28.414 }, 00:16:28.414 { 00:16:28.414 "name": "pt2", 00:16:28.414 "uuid": "7281e7e1-f576-5064-8913-bc24bff9bdcd", 00:16:28.414 "is_configured": true, 00:16:28.414 "data_offset": 2048, 00:16:28.414 "data_size": 63488 00:16:28.414 }, 00:16:28.414 { 00:16:28.414 "name": "pt3", 00:16:28.414 "uuid": "029683a3-2d24-5a3f-9a97-9cb036de46d1", 00:16:28.414 "is_configured": true, 00:16:28.414 "data_offset": 2048, 00:16:28.414 "data_size": 63488 00:16:28.414 }, 00:16:28.414 { 00:16:28.414 "name": "pt4", 00:16:28.414 "uuid": "e1f1ebc3-60d5-5c73-b39f-0a74d972b942", 00:16:28.414 "is_configured": true, 00:16:28.414 "data_offset": 2048, 00:16:28.414 "data_size": 63488 00:16:28.414 } 00:16:28.414 ] 00:16:28.414 } 00:16:28.414 } 00:16:28.415 }' 00:16:28.415 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:28.415 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:16:28.415 pt2 00:16:28.415 pt3 00:16:28.415 pt4' 00:16:28.415 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:28.415 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:28.415 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:28.673 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:28.673 "name": "pt1", 00:16:28.673 "aliases": [ 00:16:28.673 "dc7aada9-a51f-5584-aae8-9da0b43d72ee" 00:16:28.673 ], 00:16:28.673 "product_name": "passthru", 00:16:28.673 "block_size": 512, 00:16:28.673 "num_blocks": 65536, 00:16:28.673 "uuid": "dc7aada9-a51f-5584-aae8-9da0b43d72ee", 00:16:28.673 "assigned_rate_limits": { 00:16:28.673 "rw_ios_per_sec": 0, 00:16:28.673 "rw_mbytes_per_sec": 0, 00:16:28.673 "r_mbytes_per_sec": 0, 00:16:28.673 
"w_mbytes_per_sec": 0 00:16:28.673 }, 00:16:28.673 "claimed": true, 00:16:28.673 "claim_type": "exclusive_write", 00:16:28.673 "zoned": false, 00:16:28.673 "supported_io_types": { 00:16:28.673 "read": true, 00:16:28.673 "write": true, 00:16:28.673 "unmap": true, 00:16:28.673 "write_zeroes": true, 00:16:28.673 "flush": true, 00:16:28.673 "reset": true, 00:16:28.673 "compare": false, 00:16:28.673 "compare_and_write": false, 00:16:28.673 "abort": true, 00:16:28.673 "nvme_admin": false, 00:16:28.673 "nvme_io": false 00:16:28.673 }, 00:16:28.673 "memory_domains": [ 00:16:28.673 { 00:16:28.673 "dma_device_id": "system", 00:16:28.673 "dma_device_type": 1 00:16:28.673 }, 00:16:28.673 { 00:16:28.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:28.673 "dma_device_type": 2 00:16:28.673 } 00:16:28.673 ], 00:16:28.673 "driver_specific": { 00:16:28.673 "passthru": { 00:16:28.673 "name": "pt1", 00:16:28.673 "base_bdev_name": "malloc1" 00:16:28.673 } 00:16:28.673 } 00:16:28.673 }' 00:16:28.673 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:28.673 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:28.673 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:28.673 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:28.931 04:17:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:29.188 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:29.188 "name": "pt2", 00:16:29.188 "aliases": [ 00:16:29.188 "7281e7e1-f576-5064-8913-bc24bff9bdcd" 00:16:29.188 ], 00:16:29.188 "product_name": "passthru", 00:16:29.188 "block_size": 512, 00:16:29.188 "num_blocks": 65536, 00:16:29.188 "uuid": "7281e7e1-f576-5064-8913-bc24bff9bdcd", 00:16:29.188 "assigned_rate_limits": { 00:16:29.188 "rw_ios_per_sec": 0, 00:16:29.188 "rw_mbytes_per_sec": 0, 00:16:29.188 "r_mbytes_per_sec": 0, 00:16:29.188 "w_mbytes_per_sec": 0 00:16:29.188 }, 00:16:29.188 "claimed": true, 00:16:29.188 "claim_type": "exclusive_write", 00:16:29.188 "zoned": false, 00:16:29.188 "supported_io_types": { 00:16:29.188 "read": true, 00:16:29.188 "write": true, 00:16:29.188 "unmap": true, 00:16:29.188 "write_zeroes": true, 00:16:29.188 "flush": true, 00:16:29.188 "reset": true, 00:16:29.188 "compare": false, 00:16:29.188 "compare_and_write": false, 
00:16:29.188 "abort": true, 00:16:29.188 "nvme_admin": false, 00:16:29.188 "nvme_io": false 00:16:29.188 }, 00:16:29.188 "memory_domains": [ 00:16:29.188 { 00:16:29.188 "dma_device_id": "system", 00:16:29.188 "dma_device_type": 1 00:16:29.188 }, 00:16:29.188 { 00:16:29.188 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.188 "dma_device_type": 2 00:16:29.188 } 00:16:29.188 ], 00:16:29.188 "driver_specific": { 00:16:29.188 "passthru": { 00:16:29.188 "name": "pt2", 00:16:29.188 "base_bdev_name": "malloc2" 00:16:29.188 } 00:16:29.188 } 00:16:29.188 }' 00:16:29.188 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:29.188 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:29.446 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:29.704 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:29.704 "name": "pt3", 00:16:29.704 "aliases": [ 00:16:29.704 "029683a3-2d24-5a3f-9a97-9cb036de46d1" 00:16:29.704 ], 00:16:29.704 "product_name": "passthru", 00:16:29.704 "block_size": 512, 00:16:29.704 "num_blocks": 65536, 00:16:29.704 "uuid": "029683a3-2d24-5a3f-9a97-9cb036de46d1", 00:16:29.704 "assigned_rate_limits": { 00:16:29.704 "rw_ios_per_sec": 0, 00:16:29.704 "rw_mbytes_per_sec": 0, 00:16:29.704 "r_mbytes_per_sec": 0, 00:16:29.704 "w_mbytes_per_sec": 0 00:16:29.704 }, 00:16:29.704 "claimed": true, 00:16:29.704 "claim_type": "exclusive_write", 00:16:29.704 "zoned": false, 00:16:29.704 "supported_io_types": { 00:16:29.704 "read": true, 00:16:29.704 "write": true, 00:16:29.704 "unmap": true, 00:16:29.704 "write_zeroes": true, 00:16:29.704 "flush": true, 00:16:29.704 "reset": true, 00:16:29.704 "compare": false, 00:16:29.704 "compare_and_write": false, 00:16:29.704 "abort": true, 00:16:29.704 "nvme_admin": false, 00:16:29.704 "nvme_io": false 00:16:29.704 }, 00:16:29.704 "memory_domains": [ 00:16:29.704 { 00:16:29.704 "dma_device_id": "system", 00:16:29.704 "dma_device_type": 1 00:16:29.704 }, 00:16:29.704 { 00:16:29.704 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.704 "dma_device_type": 2 00:16:29.704 } 00:16:29.704 ], 00:16:29.704 "driver_specific": { 00:16:29.704 
"passthru": { 00:16:29.704 "name": "pt3", 00:16:29.704 "base_bdev_name": "malloc3" 00:16:29.704 } 00:16:29.704 } 00:16:29.704 }' 00:16:29.704 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:29.962 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:29.962 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:29.962 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:29.962 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:29.962 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:29.962 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:29.962 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:29.962 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:29.962 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:30.220 04:17:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:30.220 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:30.220 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:30.220 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:16:30.220 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:30.478 "name": "pt4", 00:16:30.478 "aliases": [ 00:16:30.478 "e1f1ebc3-60d5-5c73-b39f-0a74d972b942" 00:16:30.478 ], 00:16:30.478 "product_name": "passthru", 00:16:30.478 "block_size": 512, 00:16:30.478 "num_blocks": 65536, 00:16:30.478 "uuid": "e1f1ebc3-60d5-5c73-b39f-0a74d972b942", 00:16:30.478 "assigned_rate_limits": { 00:16:30.478 "rw_ios_per_sec": 0, 00:16:30.478 "rw_mbytes_per_sec": 0, 00:16:30.478 "r_mbytes_per_sec": 0, 00:16:30.478 "w_mbytes_per_sec": 0 00:16:30.478 }, 00:16:30.478 "claimed": true, 00:16:30.478 "claim_type": "exclusive_write", 00:16:30.478 "zoned": false, 00:16:30.478 "supported_io_types": { 00:16:30.478 "read": true, 00:16:30.478 "write": true, 00:16:30.478 "unmap": true, 00:16:30.478 "write_zeroes": true, 00:16:30.478 "flush": true, 00:16:30.478 "reset": true, 00:16:30.478 "compare": false, 00:16:30.478 "compare_and_write": false, 00:16:30.478 "abort": true, 00:16:30.478 "nvme_admin": false, 00:16:30.478 "nvme_io": false 00:16:30.478 }, 00:16:30.478 "memory_domains": [ 00:16:30.478 { 00:16:30.478 "dma_device_id": "system", 00:16:30.478 "dma_device_type": 1 00:16:30.478 }, 00:16:30.478 { 00:16:30.478 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.478 "dma_device_type": 2 00:16:30.478 } 00:16:30.478 ], 00:16:30.478 "driver_specific": { 00:16:30.478 "passthru": { 00:16:30.478 "name": "pt4", 00:16:30.478 "base_bdev_name": "malloc4" 00:16:30.478 } 00:16:30.478 } 00:16:30.478 }' 00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 
00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:30.478 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:30.736 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:30.736 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:30.736 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:30.736 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:16:30.995 [2024-05-15 04:17:18.785288] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' c17bbd91-d431-47ce-a9d6-16c6fe60ae76 '!=' c17bbd91-d431-47ce-a9d6-16c6fe60ae76 ']' 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy concat 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@216 -- # return 1 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # killprocess 3890150 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- # '[' -z 3890150 ']' 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 3890150 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3890150 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3890150' 00:16:30.995 killing process with pid 3890150 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 3890150 00:16:30.995 [2024-05-15 04:17:18.834064] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:30.995 04:17:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 3890150 00:16:30.995 [2024-05-15 04:17:18.834172] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:30.995 [2024-05-15 04:17:18.834257] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:30.995 [2024-05-15 04:17:18.834274] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1342f10 name 
raid_bdev1, state offline 00:16:30.995 [2024-05-15 04:17:18.882334] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:31.253 04:17:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@565 -- # return 0 00:16:31.253 00:16:31.253 real 0m15.524s 00:16:31.253 user 0m28.843s 00:16:31.253 sys 0m2.180s 00:16:31.253 04:17:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:16:31.253 04:17:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.253 ************************************ 00:16:31.253 END TEST raid_superblock_test 00:16:31.253 ************************************ 00:16:31.253 04:17:19 bdev_raid -- bdev/bdev_raid.sh@802 -- # for level in raid0 concat raid1 00:16:31.253 04:17:19 bdev_raid -- bdev/bdev_raid.sh@803 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:16:31.253 04:17:19 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:16:31.253 04:17:19 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:16:31.253 04:17:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:31.253 ************************************ 00:16:31.253 START TEST raid_state_function_test 00:16:31.253 ************************************ 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 4 false 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local superblock=false 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:16:31.253 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local 
base_bdevs 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@238 -- # '[' false = true ']' 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@241 -- # superblock_create_arg= 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # raid_pid=3892335 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3892335' 00:16:31.254 Process raid pid: 3892335 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@247 -- # waitforlisten 3892335 /var/tmp/spdk-raid.sock 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@827 -- # '[' -z 3892335 ']' 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:31.254 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:16:31.254 04:17:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.254 [2024-05-15 04:17:19.249382] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
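(For reference, the raid_state_function_test trace that follows drives SPDK purely through the rpc.py interface on the /var/tmp/spdk-raid.sock socket started above. The lines below are a hand-runnable condensation of the same calls, using only commands that appear verbatim in this log; the RPC= shorthand variable is introduced here for brevity and is not part of the test script, and the ordering is simplified relative to the test's repeated delete/re-create cycles.)

  RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # The test first registers the raid set while its base bdevs are still missing;
  # the raid bdev then sits in the "configuring" state until all four appear.
  $RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # "configuring"
  # Each malloc base bdev is 32 MiB with a 512-byte block size (65536 blocks, matching
  # the bdev_get_bdevs dumps in this log) and is claimed by the raid as soon as it is
  # examined; after the fourth one the state switches to "online".
  for b in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
      $RPC bdev_malloc_create 32 512 -b "$b"
  done
  $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'   # "online"
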
00:16:31.254 [2024-05-15 04:17:19.249453] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:31.512 [2024-05-15 04:17:19.329055] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:31.512 [2024-05-15 04:17:19.445433] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:16:31.512 [2024-05-15 04:17:19.517094] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:31.512 [2024-05-15 04:17:19.517148] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:32.446 04:17:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:16:32.447 04:17:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # return 0 00:16:32.447 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:32.447 [2024-05-15 04:17:20.448896] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:32.447 [2024-05-15 04:17:20.448938] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:32.447 [2024-05-15 04:17:20.448951] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:32.447 [2024-05-15 04:17:20.448962] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:32.447 [2024-05-15 04:17:20.448970] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:32.447 [2024-05-15 04:17:20.448980] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:32.447 [2024-05-15 04:17:20.448987] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:32.447 [2024-05-15 04:17:20.449007] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.704 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:32.704 "name": "Existed_Raid", 00:16:32.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.704 "strip_size_kb": 0, 00:16:32.704 "state": "configuring", 00:16:32.704 "raid_level": "raid1", 00:16:32.704 "superblock": false, 00:16:32.704 "num_base_bdevs": 4, 00:16:32.704 "num_base_bdevs_discovered": 0, 00:16:32.704 "num_base_bdevs_operational": 4, 00:16:32.704 "base_bdevs_list": [ 00:16:32.704 { 00:16:32.705 "name": "BaseBdev1", 00:16:32.705 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.705 "is_configured": false, 00:16:32.705 "data_offset": 0, 00:16:32.705 "data_size": 0 00:16:32.705 }, 00:16:32.705 { 00:16:32.705 "name": "BaseBdev2", 00:16:32.705 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.705 "is_configured": false, 00:16:32.705 "data_offset": 0, 00:16:32.705 "data_size": 0 00:16:32.705 }, 00:16:32.705 { 00:16:32.705 "name": "BaseBdev3", 00:16:32.705 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.705 "is_configured": false, 00:16:32.705 "data_offset": 0, 00:16:32.705 "data_size": 0 00:16:32.705 }, 00:16:32.705 { 00:16:32.705 "name": "BaseBdev4", 00:16:32.705 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.705 "is_configured": false, 00:16:32.705 "data_offset": 0, 00:16:32.705 "data_size": 0 00:16:32.705 } 00:16:32.705 ] 00:16:32.705 }' 00:16:32.705 04:17:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:32.705 04:17:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:33.271 04:17:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:33.528 [2024-05-15 04:17:21.463490] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:33.528 [2024-05-15 04:17:21.463525] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b5c040 name Existed_Raid, state configuring 00:16:33.528 04:17:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:33.784 [2024-05-15 04:17:21.696110] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:33.784 [2024-05-15 04:17:21.696149] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:33.784 [2024-05-15 04:17:21.696170] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:33.784 [2024-05-15 04:17:21.696182] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:33.784 [2024-05-15 04:17:21.696192] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:33.784 [2024-05-15 04:17:21.696204] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:33.784 [2024-05-15 04:17:21.696214] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:33.784 [2024-05-15 04:17:21.696226] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 
doesn't exist now 00:16:33.784 04:17:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:34.042 [2024-05-15 04:17:21.945196] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:34.042 BaseBdev1 00:16:34.042 04:17:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:16:34.042 04:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:16:34.042 04:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:34.042 04:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:34.042 04:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:34.042 04:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:34.042 04:17:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:34.300 04:17:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:34.559 [ 00:16:34.559 { 00:16:34.559 "name": "BaseBdev1", 00:16:34.559 "aliases": [ 00:16:34.559 "ced0cd18-25cb-4fc8-979f-0868b37f085e" 00:16:34.559 ], 00:16:34.559 "product_name": "Malloc disk", 00:16:34.559 "block_size": 512, 00:16:34.559 "num_blocks": 65536, 00:16:34.559 "uuid": "ced0cd18-25cb-4fc8-979f-0868b37f085e", 00:16:34.559 "assigned_rate_limits": { 00:16:34.559 "rw_ios_per_sec": 0, 00:16:34.559 "rw_mbytes_per_sec": 0, 00:16:34.559 "r_mbytes_per_sec": 0, 00:16:34.559 "w_mbytes_per_sec": 0 00:16:34.559 }, 00:16:34.559 "claimed": true, 00:16:34.559 "claim_type": "exclusive_write", 00:16:34.559 "zoned": false, 00:16:34.559 "supported_io_types": { 00:16:34.559 "read": true, 00:16:34.559 "write": true, 00:16:34.559 "unmap": true, 00:16:34.559 "write_zeroes": true, 00:16:34.559 "flush": true, 00:16:34.559 "reset": true, 00:16:34.559 "compare": false, 00:16:34.559 "compare_and_write": false, 00:16:34.559 "abort": true, 00:16:34.559 "nvme_admin": false, 00:16:34.559 "nvme_io": false 00:16:34.559 }, 00:16:34.559 "memory_domains": [ 00:16:34.559 { 00:16:34.559 "dma_device_id": "system", 00:16:34.559 "dma_device_type": 1 00:16:34.559 }, 00:16:34.559 { 00:16:34.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.559 "dma_device_type": 2 00:16:34.559 } 00:16:34.559 ], 00:16:34.559 "driver_specific": {} 00:16:34.559 } 00:16:34.559 ] 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 
00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.559 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:34.817 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:34.817 "name": "Existed_Raid", 00:16:34.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.817 "strip_size_kb": 0, 00:16:34.817 "state": "configuring", 00:16:34.817 "raid_level": "raid1", 00:16:34.817 "superblock": false, 00:16:34.817 "num_base_bdevs": 4, 00:16:34.817 "num_base_bdevs_discovered": 1, 00:16:34.817 "num_base_bdevs_operational": 4, 00:16:34.817 "base_bdevs_list": [ 00:16:34.817 { 00:16:34.817 "name": "BaseBdev1", 00:16:34.817 "uuid": "ced0cd18-25cb-4fc8-979f-0868b37f085e", 00:16:34.817 "is_configured": true, 00:16:34.817 "data_offset": 0, 00:16:34.817 "data_size": 65536 00:16:34.817 }, 00:16:34.817 { 00:16:34.817 "name": "BaseBdev2", 00:16:34.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.817 "is_configured": false, 00:16:34.817 "data_offset": 0, 00:16:34.817 "data_size": 0 00:16:34.817 }, 00:16:34.817 { 00:16:34.817 "name": "BaseBdev3", 00:16:34.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.817 "is_configured": false, 00:16:34.817 "data_offset": 0, 00:16:34.817 "data_size": 0 00:16:34.817 }, 00:16:34.817 { 00:16:34.817 "name": "BaseBdev4", 00:16:34.817 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.817 "is_configured": false, 00:16:34.817 "data_offset": 0, 00:16:34.817 "data_size": 0 00:16:34.817 } 00:16:34.817 ] 00:16:34.817 }' 00:16:34.817 04:17:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:34.817 04:17:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.383 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:35.641 [2024-05-15 04:17:23.573549] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:35.641 [2024-05-15 04:17:23.573600] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b5b8b0 name Existed_Raid, state configuring 00:16:35.641 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:35.898 [2024-05-15 04:17:23.866341] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:35.898 [2024-05-15 04:17:23.867858] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:35.898 [2024-05-15 04:17:23.867894] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:35.898 [2024-05-15 04:17:23.867907] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:35.898 [2024-05-15 04:17:23.867920] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:35.898 [2024-05-15 04:17:23.867929] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:35.898 [2024-05-15 04:17:23.867942] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.898 04:17:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:36.156 04:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:36.156 "name": "Existed_Raid", 00:16:36.156 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.156 "strip_size_kb": 0, 00:16:36.156 "state": "configuring", 00:16:36.156 "raid_level": "raid1", 00:16:36.156 "superblock": false, 00:16:36.156 "num_base_bdevs": 4, 00:16:36.156 "num_base_bdevs_discovered": 1, 00:16:36.156 "num_base_bdevs_operational": 4, 00:16:36.156 "base_bdevs_list": [ 00:16:36.156 { 00:16:36.156 "name": "BaseBdev1", 00:16:36.156 "uuid": "ced0cd18-25cb-4fc8-979f-0868b37f085e", 00:16:36.156 "is_configured": true, 00:16:36.156 "data_offset": 0, 00:16:36.156 "data_size": 65536 00:16:36.156 }, 00:16:36.156 { 00:16:36.156 "name": "BaseBdev2", 00:16:36.156 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.156 "is_configured": false, 00:16:36.156 "data_offset": 0, 00:16:36.156 "data_size": 0 00:16:36.156 }, 00:16:36.156 { 00:16:36.156 "name": "BaseBdev3", 00:16:36.156 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.156 "is_configured": false, 00:16:36.156 "data_offset": 0, 00:16:36.156 "data_size": 0 00:16:36.156 }, 00:16:36.156 { 00:16:36.156 "name": "BaseBdev4", 00:16:36.156 
"uuid": "00000000-0000-0000-0000-000000000000", 00:16:36.156 "is_configured": false, 00:16:36.156 "data_offset": 0, 00:16:36.156 "data_size": 0 00:16:36.156 } 00:16:36.156 ] 00:16:36.156 }' 00:16:36.156 04:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:36.156 04:17:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.721 04:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:36.980 [2024-05-15 04:17:24.894885] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:36.980 BaseBdev2 00:16:36.980 04:17:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:16:36.980 04:17:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:16:36.980 04:17:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:36.980 04:17:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:36.980 04:17:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:36.980 04:17:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:36.980 04:17:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:37.283 04:17:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:37.562 [ 00:16:37.562 { 00:16:37.562 "name": "BaseBdev2", 00:16:37.562 "aliases": [ 00:16:37.562 "f975b1eb-432c-4410-b5ca-d303b1a5e775" 00:16:37.562 ], 00:16:37.562 "product_name": "Malloc disk", 00:16:37.562 "block_size": 512, 00:16:37.562 "num_blocks": 65536, 00:16:37.562 "uuid": "f975b1eb-432c-4410-b5ca-d303b1a5e775", 00:16:37.562 "assigned_rate_limits": { 00:16:37.562 "rw_ios_per_sec": 0, 00:16:37.562 "rw_mbytes_per_sec": 0, 00:16:37.562 "r_mbytes_per_sec": 0, 00:16:37.562 "w_mbytes_per_sec": 0 00:16:37.562 }, 00:16:37.562 "claimed": true, 00:16:37.562 "claim_type": "exclusive_write", 00:16:37.562 "zoned": false, 00:16:37.562 "supported_io_types": { 00:16:37.562 "read": true, 00:16:37.562 "write": true, 00:16:37.562 "unmap": true, 00:16:37.562 "write_zeroes": true, 00:16:37.562 "flush": true, 00:16:37.562 "reset": true, 00:16:37.562 "compare": false, 00:16:37.562 "compare_and_write": false, 00:16:37.562 "abort": true, 00:16:37.562 "nvme_admin": false, 00:16:37.562 "nvme_io": false 00:16:37.562 }, 00:16:37.562 "memory_domains": [ 00:16:37.562 { 00:16:37.562 "dma_device_id": "system", 00:16:37.562 "dma_device_type": 1 00:16:37.562 }, 00:16:37.562 { 00:16:37.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.562 "dma_device_type": 2 00:16:37.562 } 00:16:37.562 ], 00:16:37.562 "driver_specific": {} 00:16:37.562 } 00:16:37.562 ] 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:37.562 04:17:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.562 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.821 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:37.821 "name": "Existed_Raid", 00:16:37.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.821 "strip_size_kb": 0, 00:16:37.821 "state": "configuring", 00:16:37.821 "raid_level": "raid1", 00:16:37.821 "superblock": false, 00:16:37.821 "num_base_bdevs": 4, 00:16:37.821 "num_base_bdevs_discovered": 2, 00:16:37.821 "num_base_bdevs_operational": 4, 00:16:37.821 "base_bdevs_list": [ 00:16:37.821 { 00:16:37.821 "name": "BaseBdev1", 00:16:37.821 "uuid": "ced0cd18-25cb-4fc8-979f-0868b37f085e", 00:16:37.821 "is_configured": true, 00:16:37.821 "data_offset": 0, 00:16:37.821 "data_size": 65536 00:16:37.821 }, 00:16:37.821 { 00:16:37.821 "name": "BaseBdev2", 00:16:37.821 "uuid": "f975b1eb-432c-4410-b5ca-d303b1a5e775", 00:16:37.821 "is_configured": true, 00:16:37.822 "data_offset": 0, 00:16:37.822 "data_size": 65536 00:16:37.822 }, 00:16:37.822 { 00:16:37.822 "name": "BaseBdev3", 00:16:37.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.822 "is_configured": false, 00:16:37.822 "data_offset": 0, 00:16:37.822 "data_size": 0 00:16:37.822 }, 00:16:37.822 { 00:16:37.822 "name": "BaseBdev4", 00:16:37.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.822 "is_configured": false, 00:16:37.822 "data_offset": 0, 00:16:37.822 "data_size": 0 00:16:37.822 } 00:16:37.822 ] 00:16:37.822 }' 00:16:37.822 04:17:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:37.822 04:17:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:38.387 04:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:38.646 [2024-05-15 04:17:26.489483] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:38.646 BaseBdev3 00:16:38.646 04:17:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # 
waitforbdev BaseBdev3 00:16:38.646 04:17:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:16:38.646 04:17:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:38.646 04:17:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:38.646 04:17:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:38.646 04:17:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:38.646 04:17:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:38.903 04:17:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:39.162 [ 00:16:39.162 { 00:16:39.162 "name": "BaseBdev3", 00:16:39.162 "aliases": [ 00:16:39.162 "d282325b-ab85-4663-951c-37d7fdeb88e5" 00:16:39.162 ], 00:16:39.162 "product_name": "Malloc disk", 00:16:39.162 "block_size": 512, 00:16:39.162 "num_blocks": 65536, 00:16:39.162 "uuid": "d282325b-ab85-4663-951c-37d7fdeb88e5", 00:16:39.162 "assigned_rate_limits": { 00:16:39.162 "rw_ios_per_sec": 0, 00:16:39.162 "rw_mbytes_per_sec": 0, 00:16:39.162 "r_mbytes_per_sec": 0, 00:16:39.162 "w_mbytes_per_sec": 0 00:16:39.162 }, 00:16:39.162 "claimed": true, 00:16:39.162 "claim_type": "exclusive_write", 00:16:39.162 "zoned": false, 00:16:39.162 "supported_io_types": { 00:16:39.162 "read": true, 00:16:39.162 "write": true, 00:16:39.162 "unmap": true, 00:16:39.162 "write_zeroes": true, 00:16:39.162 "flush": true, 00:16:39.162 "reset": true, 00:16:39.162 "compare": false, 00:16:39.162 "compare_and_write": false, 00:16:39.162 "abort": true, 00:16:39.162 "nvme_admin": false, 00:16:39.162 "nvme_io": false 00:16:39.162 }, 00:16:39.162 "memory_domains": [ 00:16:39.162 { 00:16:39.162 "dma_device_id": "system", 00:16:39.162 "dma_device_type": 1 00:16:39.162 }, 00:16:39.162 { 00:16:39.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.162 "dma_device_type": 2 00:16:39.162 } 00:16:39.162 ], 00:16:39.162 "driver_specific": {} 00:16:39.162 } 00:16:39.162 ] 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:39.162 04:17:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.162 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:39.420 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:39.420 "name": "Existed_Raid", 00:16:39.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.420 "strip_size_kb": 0, 00:16:39.420 "state": "configuring", 00:16:39.420 "raid_level": "raid1", 00:16:39.420 "superblock": false, 00:16:39.420 "num_base_bdevs": 4, 00:16:39.420 "num_base_bdevs_discovered": 3, 00:16:39.420 "num_base_bdevs_operational": 4, 00:16:39.420 "base_bdevs_list": [ 00:16:39.420 { 00:16:39.420 "name": "BaseBdev1", 00:16:39.420 "uuid": "ced0cd18-25cb-4fc8-979f-0868b37f085e", 00:16:39.420 "is_configured": true, 00:16:39.420 "data_offset": 0, 00:16:39.420 "data_size": 65536 00:16:39.420 }, 00:16:39.420 { 00:16:39.420 "name": "BaseBdev2", 00:16:39.420 "uuid": "f975b1eb-432c-4410-b5ca-d303b1a5e775", 00:16:39.420 "is_configured": true, 00:16:39.420 "data_offset": 0, 00:16:39.420 "data_size": 65536 00:16:39.420 }, 00:16:39.420 { 00:16:39.420 "name": "BaseBdev3", 00:16:39.420 "uuid": "d282325b-ab85-4663-951c-37d7fdeb88e5", 00:16:39.420 "is_configured": true, 00:16:39.420 "data_offset": 0, 00:16:39.420 "data_size": 65536 00:16:39.420 }, 00:16:39.420 { 00:16:39.420 "name": "BaseBdev4", 00:16:39.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.420 "is_configured": false, 00:16:39.420 "data_offset": 0, 00:16:39.420 "data_size": 0 00:16:39.420 } 00:16:39.420 ] 00:16:39.420 }' 00:16:39.420 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:39.420 04:17:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:39.986 04:17:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:40.244 [2024-05-15 04:17:28.057937] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:40.244 [2024-05-15 04:17:28.057986] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b5c7f0 00:16:40.244 [2024-05-15 04:17:28.057995] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:40.244 [2024-05-15 04:17:28.058156] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d105f0 00:16:40.244 [2024-05-15 04:17:28.058288] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b5c7f0 00:16:40.244 [2024-05-15 04:17:28.058301] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b5c7f0 00:16:40.244 [2024-05-15 04:17:28.058492] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:40.244 BaseBdev4 00:16:40.244 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:16:40.244 04:17:28 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:16:40.245 04:17:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:40.245 04:17:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:40.245 04:17:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:40.245 04:17:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:40.245 04:17:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:40.502 04:17:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:40.761 [ 00:16:40.761 { 00:16:40.761 "name": "BaseBdev4", 00:16:40.761 "aliases": [ 00:16:40.761 "8f417f07-621b-48b1-a345-4067482bfb19" 00:16:40.761 ], 00:16:40.761 "product_name": "Malloc disk", 00:16:40.761 "block_size": 512, 00:16:40.761 "num_blocks": 65536, 00:16:40.761 "uuid": "8f417f07-621b-48b1-a345-4067482bfb19", 00:16:40.761 "assigned_rate_limits": { 00:16:40.761 "rw_ios_per_sec": 0, 00:16:40.761 "rw_mbytes_per_sec": 0, 00:16:40.761 "r_mbytes_per_sec": 0, 00:16:40.761 "w_mbytes_per_sec": 0 00:16:40.761 }, 00:16:40.761 "claimed": true, 00:16:40.761 "claim_type": "exclusive_write", 00:16:40.761 "zoned": false, 00:16:40.761 "supported_io_types": { 00:16:40.761 "read": true, 00:16:40.761 "write": true, 00:16:40.761 "unmap": true, 00:16:40.761 "write_zeroes": true, 00:16:40.761 "flush": true, 00:16:40.761 "reset": true, 00:16:40.761 "compare": false, 00:16:40.761 "compare_and_write": false, 00:16:40.761 "abort": true, 00:16:40.761 "nvme_admin": false, 00:16:40.761 "nvme_io": false 00:16:40.761 }, 00:16:40.761 "memory_domains": [ 00:16:40.761 { 00:16:40.761 "dma_device_id": "system", 00:16:40.761 "dma_device_type": 1 00:16:40.761 }, 00:16:40.761 { 00:16:40.761 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.761 "dma_device_type": 2 00:16:40.761 } 00:16:40.761 ], 00:16:40.761 "driver_specific": {} 00:16:40.761 } 00:16:40.761 ] 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.761 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.019 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:41.019 "name": "Existed_Raid", 00:16:41.019 "uuid": "a06ff211-446d-4f12-86b3-6c4ea99cb680", 00:16:41.019 "strip_size_kb": 0, 00:16:41.019 "state": "online", 00:16:41.019 "raid_level": "raid1", 00:16:41.019 "superblock": false, 00:16:41.019 "num_base_bdevs": 4, 00:16:41.019 "num_base_bdevs_discovered": 4, 00:16:41.019 "num_base_bdevs_operational": 4, 00:16:41.019 "base_bdevs_list": [ 00:16:41.019 { 00:16:41.019 "name": "BaseBdev1", 00:16:41.019 "uuid": "ced0cd18-25cb-4fc8-979f-0868b37f085e", 00:16:41.019 "is_configured": true, 00:16:41.019 "data_offset": 0, 00:16:41.019 "data_size": 65536 00:16:41.019 }, 00:16:41.019 { 00:16:41.019 "name": "BaseBdev2", 00:16:41.019 "uuid": "f975b1eb-432c-4410-b5ca-d303b1a5e775", 00:16:41.019 "is_configured": true, 00:16:41.019 "data_offset": 0, 00:16:41.019 "data_size": 65536 00:16:41.019 }, 00:16:41.019 { 00:16:41.019 "name": "BaseBdev3", 00:16:41.019 "uuid": "d282325b-ab85-4663-951c-37d7fdeb88e5", 00:16:41.019 "is_configured": true, 00:16:41.019 "data_offset": 0, 00:16:41.019 "data_size": 65536 00:16:41.019 }, 00:16:41.019 { 00:16:41.019 "name": "BaseBdev4", 00:16:41.019 "uuid": "8f417f07-621b-48b1-a345-4067482bfb19", 00:16:41.019 "is_configured": true, 00:16:41.019 "data_offset": 0, 00:16:41.019 "data_size": 65536 00:16:41.019 } 00:16:41.019 ] 00:16:41.019 }' 00:16:41.019 04:17:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:41.019 04:17:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.585 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:16:41.585 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:16:41.585 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:16:41.585 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:16:41.585 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:16:41.585 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:16:41.585 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:41.585 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:16:41.585 [2024-05-15 04:17:29.542111] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:41.585 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:16:41.585 "name": "Existed_Raid", 00:16:41.585 "aliases": [ 00:16:41.585 "a06ff211-446d-4f12-86b3-6c4ea99cb680" 00:16:41.585 ], 00:16:41.585 
"product_name": "Raid Volume", 00:16:41.585 "block_size": 512, 00:16:41.585 "num_blocks": 65536, 00:16:41.585 "uuid": "a06ff211-446d-4f12-86b3-6c4ea99cb680", 00:16:41.585 "assigned_rate_limits": { 00:16:41.585 "rw_ios_per_sec": 0, 00:16:41.585 "rw_mbytes_per_sec": 0, 00:16:41.585 "r_mbytes_per_sec": 0, 00:16:41.585 "w_mbytes_per_sec": 0 00:16:41.585 }, 00:16:41.585 "claimed": false, 00:16:41.585 "zoned": false, 00:16:41.585 "supported_io_types": { 00:16:41.585 "read": true, 00:16:41.585 "write": true, 00:16:41.585 "unmap": false, 00:16:41.585 "write_zeroes": true, 00:16:41.585 "flush": false, 00:16:41.585 "reset": true, 00:16:41.585 "compare": false, 00:16:41.585 "compare_and_write": false, 00:16:41.585 "abort": false, 00:16:41.585 "nvme_admin": false, 00:16:41.585 "nvme_io": false 00:16:41.585 }, 00:16:41.585 "memory_domains": [ 00:16:41.585 { 00:16:41.585 "dma_device_id": "system", 00:16:41.585 "dma_device_type": 1 00:16:41.585 }, 00:16:41.585 { 00:16:41.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.585 "dma_device_type": 2 00:16:41.585 }, 00:16:41.585 { 00:16:41.585 "dma_device_id": "system", 00:16:41.585 "dma_device_type": 1 00:16:41.585 }, 00:16:41.585 { 00:16:41.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.585 "dma_device_type": 2 00:16:41.585 }, 00:16:41.585 { 00:16:41.585 "dma_device_id": "system", 00:16:41.585 "dma_device_type": 1 00:16:41.585 }, 00:16:41.585 { 00:16:41.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.586 "dma_device_type": 2 00:16:41.586 }, 00:16:41.586 { 00:16:41.586 "dma_device_id": "system", 00:16:41.586 "dma_device_type": 1 00:16:41.586 }, 00:16:41.586 { 00:16:41.586 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.586 "dma_device_type": 2 00:16:41.586 } 00:16:41.586 ], 00:16:41.586 "driver_specific": { 00:16:41.586 "raid": { 00:16:41.586 "uuid": "a06ff211-446d-4f12-86b3-6c4ea99cb680", 00:16:41.586 "strip_size_kb": 0, 00:16:41.586 "state": "online", 00:16:41.586 "raid_level": "raid1", 00:16:41.586 "superblock": false, 00:16:41.586 "num_base_bdevs": 4, 00:16:41.586 "num_base_bdevs_discovered": 4, 00:16:41.586 "num_base_bdevs_operational": 4, 00:16:41.586 "base_bdevs_list": [ 00:16:41.586 { 00:16:41.586 "name": "BaseBdev1", 00:16:41.586 "uuid": "ced0cd18-25cb-4fc8-979f-0868b37f085e", 00:16:41.586 "is_configured": true, 00:16:41.586 "data_offset": 0, 00:16:41.586 "data_size": 65536 00:16:41.586 }, 00:16:41.586 { 00:16:41.586 "name": "BaseBdev2", 00:16:41.586 "uuid": "f975b1eb-432c-4410-b5ca-d303b1a5e775", 00:16:41.586 "is_configured": true, 00:16:41.586 "data_offset": 0, 00:16:41.586 "data_size": 65536 00:16:41.586 }, 00:16:41.586 { 00:16:41.586 "name": "BaseBdev3", 00:16:41.586 "uuid": "d282325b-ab85-4663-951c-37d7fdeb88e5", 00:16:41.586 "is_configured": true, 00:16:41.586 "data_offset": 0, 00:16:41.586 "data_size": 65536 00:16:41.586 }, 00:16:41.586 { 00:16:41.586 "name": "BaseBdev4", 00:16:41.586 "uuid": "8f417f07-621b-48b1-a345-4067482bfb19", 00:16:41.586 "is_configured": true, 00:16:41.586 "data_offset": 0, 00:16:41.586 "data_size": 65536 00:16:41.586 } 00:16:41.586 ] 00:16:41.586 } 00:16:41.586 } 00:16:41.586 }' 00:16:41.586 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:41.586 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:16:41.586 BaseBdev2 00:16:41.586 BaseBdev3 00:16:41.586 BaseBdev4' 00:16:41.586 04:17:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:41.586 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:41.586 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:41.844 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:41.844 "name": "BaseBdev1", 00:16:41.844 "aliases": [ 00:16:41.844 "ced0cd18-25cb-4fc8-979f-0868b37f085e" 00:16:41.844 ], 00:16:41.844 "product_name": "Malloc disk", 00:16:41.844 "block_size": 512, 00:16:41.844 "num_blocks": 65536, 00:16:41.844 "uuid": "ced0cd18-25cb-4fc8-979f-0868b37f085e", 00:16:41.844 "assigned_rate_limits": { 00:16:41.844 "rw_ios_per_sec": 0, 00:16:41.844 "rw_mbytes_per_sec": 0, 00:16:41.844 "r_mbytes_per_sec": 0, 00:16:41.844 "w_mbytes_per_sec": 0 00:16:41.844 }, 00:16:41.844 "claimed": true, 00:16:41.844 "claim_type": "exclusive_write", 00:16:41.844 "zoned": false, 00:16:41.844 "supported_io_types": { 00:16:41.844 "read": true, 00:16:41.844 "write": true, 00:16:41.844 "unmap": true, 00:16:41.844 "write_zeroes": true, 00:16:41.844 "flush": true, 00:16:41.844 "reset": true, 00:16:41.844 "compare": false, 00:16:41.844 "compare_and_write": false, 00:16:41.844 "abort": true, 00:16:41.844 "nvme_admin": false, 00:16:41.844 "nvme_io": false 00:16:41.844 }, 00:16:41.844 "memory_domains": [ 00:16:41.844 { 00:16:41.844 "dma_device_id": "system", 00:16:41.844 "dma_device_type": 1 00:16:41.844 }, 00:16:41.844 { 00:16:41.844 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.844 "dma_device_type": 2 00:16:41.844 } 00:16:41.844 ], 00:16:41.844 "driver_specific": {} 00:16:41.844 }' 00:16:41.844 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:42.101 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:42.101 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:42.101 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:42.101 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:42.101 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:42.101 04:17:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:42.101 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:42.101 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:42.101 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:42.101 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:42.358 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:42.358 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:42.358 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:42.358 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:42.617 "name": 
"BaseBdev2", 00:16:42.617 "aliases": [ 00:16:42.617 "f975b1eb-432c-4410-b5ca-d303b1a5e775" 00:16:42.617 ], 00:16:42.617 "product_name": "Malloc disk", 00:16:42.617 "block_size": 512, 00:16:42.617 "num_blocks": 65536, 00:16:42.617 "uuid": "f975b1eb-432c-4410-b5ca-d303b1a5e775", 00:16:42.617 "assigned_rate_limits": { 00:16:42.617 "rw_ios_per_sec": 0, 00:16:42.617 "rw_mbytes_per_sec": 0, 00:16:42.617 "r_mbytes_per_sec": 0, 00:16:42.617 "w_mbytes_per_sec": 0 00:16:42.617 }, 00:16:42.617 "claimed": true, 00:16:42.617 "claim_type": "exclusive_write", 00:16:42.617 "zoned": false, 00:16:42.617 "supported_io_types": { 00:16:42.617 "read": true, 00:16:42.617 "write": true, 00:16:42.617 "unmap": true, 00:16:42.617 "write_zeroes": true, 00:16:42.617 "flush": true, 00:16:42.617 "reset": true, 00:16:42.617 "compare": false, 00:16:42.617 "compare_and_write": false, 00:16:42.617 "abort": true, 00:16:42.617 "nvme_admin": false, 00:16:42.617 "nvme_io": false 00:16:42.617 }, 00:16:42.617 "memory_domains": [ 00:16:42.617 { 00:16:42.617 "dma_device_id": "system", 00:16:42.617 "dma_device_type": 1 00:16:42.617 }, 00:16:42.617 { 00:16:42.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.617 "dma_device_type": 2 00:16:42.617 } 00:16:42.617 ], 00:16:42.617 "driver_specific": {} 00:16:42.617 }' 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:42.617 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:42.874 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:42.874 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:42.874 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:42.874 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:42.875 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:43.132 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:43.132 "name": "BaseBdev3", 00:16:43.132 "aliases": [ 00:16:43.132 "d282325b-ab85-4663-951c-37d7fdeb88e5" 00:16:43.132 ], 00:16:43.132 "product_name": "Malloc disk", 00:16:43.132 "block_size": 512, 00:16:43.132 "num_blocks": 65536, 00:16:43.132 "uuid": "d282325b-ab85-4663-951c-37d7fdeb88e5", 00:16:43.132 "assigned_rate_limits": { 00:16:43.132 "rw_ios_per_sec": 0, 00:16:43.132 "rw_mbytes_per_sec": 0, 00:16:43.132 "r_mbytes_per_sec": 0, 00:16:43.132 "w_mbytes_per_sec": 0 00:16:43.132 }, 
00:16:43.132 "claimed": true, 00:16:43.132 "claim_type": "exclusive_write", 00:16:43.132 "zoned": false, 00:16:43.132 "supported_io_types": { 00:16:43.132 "read": true, 00:16:43.132 "write": true, 00:16:43.132 "unmap": true, 00:16:43.132 "write_zeroes": true, 00:16:43.132 "flush": true, 00:16:43.132 "reset": true, 00:16:43.132 "compare": false, 00:16:43.132 "compare_and_write": false, 00:16:43.132 "abort": true, 00:16:43.132 "nvme_admin": false, 00:16:43.132 "nvme_io": false 00:16:43.132 }, 00:16:43.132 "memory_domains": [ 00:16:43.132 { 00:16:43.132 "dma_device_id": "system", 00:16:43.132 "dma_device_type": 1 00:16:43.132 }, 00:16:43.132 { 00:16:43.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.132 "dma_device_type": 2 00:16:43.132 } 00:16:43.132 ], 00:16:43.132 "driver_specific": {} 00:16:43.132 }' 00:16:43.132 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:43.132 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:43.132 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:43.132 04:17:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:43.132 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:43.132 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:43.132 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:43.132 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:43.132 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:43.132 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:43.132 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:43.390 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:43.390 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:16:43.390 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:43.390 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:16:43.648 "name": "BaseBdev4", 00:16:43.648 "aliases": [ 00:16:43.648 "8f417f07-621b-48b1-a345-4067482bfb19" 00:16:43.648 ], 00:16:43.648 "product_name": "Malloc disk", 00:16:43.648 "block_size": 512, 00:16:43.648 "num_blocks": 65536, 00:16:43.648 "uuid": "8f417f07-621b-48b1-a345-4067482bfb19", 00:16:43.648 "assigned_rate_limits": { 00:16:43.648 "rw_ios_per_sec": 0, 00:16:43.648 "rw_mbytes_per_sec": 0, 00:16:43.648 "r_mbytes_per_sec": 0, 00:16:43.648 "w_mbytes_per_sec": 0 00:16:43.648 }, 00:16:43.648 "claimed": true, 00:16:43.648 "claim_type": "exclusive_write", 00:16:43.648 "zoned": false, 00:16:43.648 "supported_io_types": { 00:16:43.648 "read": true, 00:16:43.648 "write": true, 00:16:43.648 "unmap": true, 00:16:43.648 "write_zeroes": true, 00:16:43.648 "flush": true, 00:16:43.648 "reset": true, 00:16:43.648 "compare": false, 00:16:43.648 "compare_and_write": false, 00:16:43.648 "abort": true, 00:16:43.648 "nvme_admin": false, 00:16:43.648 "nvme_io": false 
00:16:43.648 }, 00:16:43.648 "memory_domains": [ 00:16:43.648 { 00:16:43.648 "dma_device_id": "system", 00:16:43.648 "dma_device_type": 1 00:16:43.648 }, 00:16:43.648 { 00:16:43.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.648 "dma_device_type": 2 00:16:43.648 } 00:16:43.648 ], 00:16:43.648 "driver_specific": {} 00:16:43.648 }' 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:43.648 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:43.906 [2024-05-15 04:17:31.896201] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # local expected_state 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 0 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 
00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.906 04:17:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.164 04:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:44.164 "name": "Existed_Raid", 00:16:44.164 "uuid": "a06ff211-446d-4f12-86b3-6c4ea99cb680", 00:16:44.164 "strip_size_kb": 0, 00:16:44.164 "state": "online", 00:16:44.164 "raid_level": "raid1", 00:16:44.164 "superblock": false, 00:16:44.164 "num_base_bdevs": 4, 00:16:44.164 "num_base_bdevs_discovered": 3, 00:16:44.164 "num_base_bdevs_operational": 3, 00:16:44.164 "base_bdevs_list": [ 00:16:44.164 { 00:16:44.164 "name": null, 00:16:44.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.164 "is_configured": false, 00:16:44.164 "data_offset": 0, 00:16:44.164 "data_size": 65536 00:16:44.164 }, 00:16:44.164 { 00:16:44.164 "name": "BaseBdev2", 00:16:44.164 "uuid": "f975b1eb-432c-4410-b5ca-d303b1a5e775", 00:16:44.164 "is_configured": true, 00:16:44.164 "data_offset": 0, 00:16:44.164 "data_size": 65536 00:16:44.164 }, 00:16:44.164 { 00:16:44.164 "name": "BaseBdev3", 00:16:44.164 "uuid": "d282325b-ab85-4663-951c-37d7fdeb88e5", 00:16:44.164 "is_configured": true, 00:16:44.164 "data_offset": 0, 00:16:44.164 "data_size": 65536 00:16:44.164 }, 00:16:44.164 { 00:16:44.164 "name": "BaseBdev4", 00:16:44.164 "uuid": "8f417f07-621b-48b1-a345-4067482bfb19", 00:16:44.164 "is_configured": true, 00:16:44.164 "data_offset": 0, 00:16:44.164 "data_size": 65536 00:16:44.164 } 00:16:44.164 ] 00:16:44.164 }' 00:16:44.164 04:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:44.164 04:17:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.729 04:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:16:44.729 04:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:44.729 04:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.729 04:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:16:44.986 04:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:16:44.986 04:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:44.986 04:17:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:45.244 [2024-05-15 04:17:33.145917] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:45.244 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:16:45.244 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:45.244 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.244 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 
00:16:45.501 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:16:45.501 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:45.501 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:45.757 [2024-05-15 04:17:33.648619] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:45.757 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:16:45.757 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:45.757 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.757 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:16:46.014 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:16:46.014 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:46.014 04:17:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:46.272 [2024-05-15 04:17:34.145388] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:46.272 [2024-05-15 04:17:34.145473] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:46.272 [2024-05-15 04:17:34.158329] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:46.272 [2024-05-15 04:17:34.158383] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:46.272 [2024-05-15 04:17:34.158395] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b5c7f0 name Existed_Raid, state offline 00:16:46.272 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:16:46.272 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:16:46.272 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.272 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:16:46.529 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:16:46.529 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:16:46.529 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:16:46.529 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:16:46.529 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:46.529 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:46.787 BaseBdev2 00:16:46.787 04:17:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev 
BaseBdev2 00:16:46.787 04:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:16:46.787 04:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:46.787 04:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:46.787 04:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:46.787 04:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:46.787 04:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:47.050 04:17:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:47.308 [ 00:16:47.308 { 00:16:47.308 "name": "BaseBdev2", 00:16:47.308 "aliases": [ 00:16:47.309 "b6740475-e5be-4922-9c15-0b2c9d1e655b" 00:16:47.309 ], 00:16:47.309 "product_name": "Malloc disk", 00:16:47.309 "block_size": 512, 00:16:47.309 "num_blocks": 65536, 00:16:47.309 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:16:47.309 "assigned_rate_limits": { 00:16:47.309 "rw_ios_per_sec": 0, 00:16:47.309 "rw_mbytes_per_sec": 0, 00:16:47.309 "r_mbytes_per_sec": 0, 00:16:47.309 "w_mbytes_per_sec": 0 00:16:47.309 }, 00:16:47.309 "claimed": false, 00:16:47.309 "zoned": false, 00:16:47.309 "supported_io_types": { 00:16:47.309 "read": true, 00:16:47.309 "write": true, 00:16:47.309 "unmap": true, 00:16:47.309 "write_zeroes": true, 00:16:47.309 "flush": true, 00:16:47.309 "reset": true, 00:16:47.309 "compare": false, 00:16:47.309 "compare_and_write": false, 00:16:47.309 "abort": true, 00:16:47.309 "nvme_admin": false, 00:16:47.309 "nvme_io": false 00:16:47.309 }, 00:16:47.309 "memory_domains": [ 00:16:47.309 { 00:16:47.309 "dma_device_id": "system", 00:16:47.309 "dma_device_type": 1 00:16:47.309 }, 00:16:47.309 { 00:16:47.309 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.309 "dma_device_type": 2 00:16:47.309 } 00:16:47.309 ], 00:16:47.309 "driver_specific": {} 00:16:47.309 } 00:16:47.309 ] 00:16:47.309 04:17:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:47.309 04:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:47.309 04:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:47.309 04:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:47.566 BaseBdev3 00:16:47.566 04:17:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:16:47.566 04:17:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:16:47.566 04:17:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:47.566 04:17:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:47.566 04:17:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:47.566 04:17:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:47.566 04:17:35 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:47.824 04:17:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:48.081 [ 00:16:48.081 { 00:16:48.081 "name": "BaseBdev3", 00:16:48.081 "aliases": [ 00:16:48.081 "6e56ab07-1780-438e-aa29-175b5167ebe9" 00:16:48.081 ], 00:16:48.081 "product_name": "Malloc disk", 00:16:48.081 "block_size": 512, 00:16:48.081 "num_blocks": 65536, 00:16:48.081 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:16:48.081 "assigned_rate_limits": { 00:16:48.081 "rw_ios_per_sec": 0, 00:16:48.081 "rw_mbytes_per_sec": 0, 00:16:48.081 "r_mbytes_per_sec": 0, 00:16:48.081 "w_mbytes_per_sec": 0 00:16:48.081 }, 00:16:48.081 "claimed": false, 00:16:48.081 "zoned": false, 00:16:48.081 "supported_io_types": { 00:16:48.081 "read": true, 00:16:48.081 "write": true, 00:16:48.081 "unmap": true, 00:16:48.081 "write_zeroes": true, 00:16:48.081 "flush": true, 00:16:48.081 "reset": true, 00:16:48.081 "compare": false, 00:16:48.081 "compare_and_write": false, 00:16:48.081 "abort": true, 00:16:48.081 "nvme_admin": false, 00:16:48.081 "nvme_io": false 00:16:48.081 }, 00:16:48.081 "memory_domains": [ 00:16:48.081 { 00:16:48.081 "dma_device_id": "system", 00:16:48.081 "dma_device_type": 1 00:16:48.081 }, 00:16:48.081 { 00:16:48.081 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.081 "dma_device_type": 2 00:16:48.081 } 00:16:48.081 ], 00:16:48.081 "driver_specific": {} 00:16:48.081 } 00:16:48.081 ] 00:16:48.081 04:17:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:48.081 04:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:48.081 04:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:48.081 04:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:48.339 BaseBdev4 00:16:48.339 04:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:16:48.339 04:17:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:16:48.339 04:17:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:48.339 04:17:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:48.339 04:17:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:48.339 04:17:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:48.339 04:17:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:48.597 04:17:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:48.855 [ 00:16:48.855 { 00:16:48.855 "name": "BaseBdev4", 00:16:48.855 "aliases": [ 00:16:48.855 "f8c94d46-3476-41dd-8085-2163b5e45a74" 00:16:48.855 ], 00:16:48.855 
"product_name": "Malloc disk", 00:16:48.855 "block_size": 512, 00:16:48.855 "num_blocks": 65536, 00:16:48.855 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:16:48.855 "assigned_rate_limits": { 00:16:48.855 "rw_ios_per_sec": 0, 00:16:48.855 "rw_mbytes_per_sec": 0, 00:16:48.855 "r_mbytes_per_sec": 0, 00:16:48.855 "w_mbytes_per_sec": 0 00:16:48.855 }, 00:16:48.855 "claimed": false, 00:16:48.855 "zoned": false, 00:16:48.855 "supported_io_types": { 00:16:48.855 "read": true, 00:16:48.855 "write": true, 00:16:48.855 "unmap": true, 00:16:48.855 "write_zeroes": true, 00:16:48.855 "flush": true, 00:16:48.855 "reset": true, 00:16:48.855 "compare": false, 00:16:48.855 "compare_and_write": false, 00:16:48.855 "abort": true, 00:16:48.855 "nvme_admin": false, 00:16:48.855 "nvme_io": false 00:16:48.855 }, 00:16:48.855 "memory_domains": [ 00:16:48.855 { 00:16:48.855 "dma_device_id": "system", 00:16:48.855 "dma_device_type": 1 00:16:48.855 }, 00:16:48.855 { 00:16:48.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.855 "dma_device_type": 2 00:16:48.855 } 00:16:48.855 ], 00:16:48.855 "driver_specific": {} 00:16:48.855 } 00:16:48.855 ] 00:16:48.855 04:17:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:48.855 04:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:16:48.855 04:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:16:48.855 04:17:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:49.113 [2024-05-15 04:17:37.070441] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:49.113 [2024-05-15 04:17:37.070485] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:49.113 [2024-05-15 04:17:37.070509] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:49.113 [2024-05-15 04:17:37.071753] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:49.113 [2024-05-15 04:17:37.071795] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:49.113 04:17:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.113 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.371 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:49.371 "name": "Existed_Raid", 00:16:49.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.371 "strip_size_kb": 0, 00:16:49.371 "state": "configuring", 00:16:49.371 "raid_level": "raid1", 00:16:49.371 "superblock": false, 00:16:49.371 "num_base_bdevs": 4, 00:16:49.371 "num_base_bdevs_discovered": 3, 00:16:49.371 "num_base_bdevs_operational": 4, 00:16:49.371 "base_bdevs_list": [ 00:16:49.371 { 00:16:49.371 "name": "BaseBdev1", 00:16:49.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.371 "is_configured": false, 00:16:49.371 "data_offset": 0, 00:16:49.371 "data_size": 0 00:16:49.371 }, 00:16:49.371 { 00:16:49.371 "name": "BaseBdev2", 00:16:49.371 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:16:49.371 "is_configured": true, 00:16:49.371 "data_offset": 0, 00:16:49.371 "data_size": 65536 00:16:49.371 }, 00:16:49.371 { 00:16:49.371 "name": "BaseBdev3", 00:16:49.371 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:16:49.371 "is_configured": true, 00:16:49.371 "data_offset": 0, 00:16:49.371 "data_size": 65536 00:16:49.371 }, 00:16:49.371 { 00:16:49.371 "name": "BaseBdev4", 00:16:49.371 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:16:49.371 "is_configured": true, 00:16:49.371 "data_offset": 0, 00:16:49.371 "data_size": 65536 00:16:49.371 } 00:16:49.371 ] 00:16:49.371 }' 00:16:49.371 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:49.371 04:17:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:49.936 04:17:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:50.194 [2024-05-15 04:17:38.137289] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.194 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.452 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:50.452 "name": "Existed_Raid", 00:16:50.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:50.452 "strip_size_kb": 0, 00:16:50.452 "state": "configuring", 00:16:50.452 "raid_level": "raid1", 00:16:50.452 "superblock": false, 00:16:50.452 "num_base_bdevs": 4, 00:16:50.452 "num_base_bdevs_discovered": 2, 00:16:50.452 "num_base_bdevs_operational": 4, 00:16:50.452 "base_bdevs_list": [ 00:16:50.452 { 00:16:50.452 "name": "BaseBdev1", 00:16:50.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:50.452 "is_configured": false, 00:16:50.452 "data_offset": 0, 00:16:50.452 "data_size": 0 00:16:50.452 }, 00:16:50.452 { 00:16:50.452 "name": null, 00:16:50.452 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:16:50.452 "is_configured": false, 00:16:50.452 "data_offset": 0, 00:16:50.452 "data_size": 65536 00:16:50.452 }, 00:16:50.452 { 00:16:50.452 "name": "BaseBdev3", 00:16:50.452 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:16:50.452 "is_configured": true, 00:16:50.453 "data_offset": 0, 00:16:50.453 "data_size": 65536 00:16:50.453 }, 00:16:50.453 { 00:16:50.453 "name": "BaseBdev4", 00:16:50.453 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:16:50.453 "is_configured": true, 00:16:50.453 "data_offset": 0, 00:16:50.453 "data_size": 65536 00:16:50.453 } 00:16:50.453 ] 00:16:50.453 }' 00:16:50.453 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:50.453 04:17:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:51.018 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.018 04:17:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:51.276 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:16:51.276 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:51.534 [2024-05-15 04:17:39.442958] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:51.534 BaseBdev1 00:16:51.534 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:16:51.534 04:17:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:16:51.534 04:17:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:51.534 04:17:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:51.534 04:17:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:51.534 04:17:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:51.534 04:17:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:16:51.792 04:17:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:52.050 [ 00:16:52.050 { 00:16:52.050 "name": "BaseBdev1", 00:16:52.050 "aliases": [ 00:16:52.050 "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b" 00:16:52.050 ], 00:16:52.050 "product_name": "Malloc disk", 00:16:52.050 "block_size": 512, 00:16:52.050 "num_blocks": 65536, 00:16:52.050 "uuid": "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b", 00:16:52.050 "assigned_rate_limits": { 00:16:52.050 "rw_ios_per_sec": 0, 00:16:52.050 "rw_mbytes_per_sec": 0, 00:16:52.050 "r_mbytes_per_sec": 0, 00:16:52.050 "w_mbytes_per_sec": 0 00:16:52.050 }, 00:16:52.050 "claimed": true, 00:16:52.050 "claim_type": "exclusive_write", 00:16:52.050 "zoned": false, 00:16:52.050 "supported_io_types": { 00:16:52.050 "read": true, 00:16:52.050 "write": true, 00:16:52.050 "unmap": true, 00:16:52.050 "write_zeroes": true, 00:16:52.050 "flush": true, 00:16:52.051 "reset": true, 00:16:52.051 "compare": false, 00:16:52.051 "compare_and_write": false, 00:16:52.051 "abort": true, 00:16:52.051 "nvme_admin": false, 00:16:52.051 "nvme_io": false 00:16:52.051 }, 00:16:52.051 "memory_domains": [ 00:16:52.051 { 00:16:52.051 "dma_device_id": "system", 00:16:52.051 "dma_device_type": 1 00:16:52.051 }, 00:16:52.051 { 00:16:52.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.051 "dma_device_type": 2 00:16:52.051 } 00:16:52.051 ], 00:16:52.051 "driver_specific": {} 00:16:52.051 } 00:16:52.051 ] 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.051 04:17:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.310 04:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:52.310 "name": "Existed_Raid", 00:16:52.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:52.310 "strip_size_kb": 0, 00:16:52.310 "state": "configuring", 00:16:52.310 "raid_level": "raid1", 00:16:52.310 "superblock": false, 00:16:52.310 
"num_base_bdevs": 4, 00:16:52.310 "num_base_bdevs_discovered": 3, 00:16:52.310 "num_base_bdevs_operational": 4, 00:16:52.310 "base_bdevs_list": [ 00:16:52.310 { 00:16:52.310 "name": "BaseBdev1", 00:16:52.310 "uuid": "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b", 00:16:52.310 "is_configured": true, 00:16:52.310 "data_offset": 0, 00:16:52.310 "data_size": 65536 00:16:52.310 }, 00:16:52.310 { 00:16:52.310 "name": null, 00:16:52.310 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:16:52.310 "is_configured": false, 00:16:52.310 "data_offset": 0, 00:16:52.310 "data_size": 65536 00:16:52.310 }, 00:16:52.310 { 00:16:52.310 "name": "BaseBdev3", 00:16:52.310 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:16:52.310 "is_configured": true, 00:16:52.310 "data_offset": 0, 00:16:52.310 "data_size": 65536 00:16:52.310 }, 00:16:52.310 { 00:16:52.310 "name": "BaseBdev4", 00:16:52.310 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:16:52.310 "is_configured": true, 00:16:52.310 "data_offset": 0, 00:16:52.310 "data_size": 65536 00:16:52.310 } 00:16:52.310 ] 00:16:52.310 }' 00:16:52.310 04:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:52.310 04:17:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.876 04:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.877 04:17:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:53.134 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:16:53.134 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:53.392 [2024-05-15 04:17:41.263797] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.392 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:53.650 04:17:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:53.650 "name": "Existed_Raid", 00:16:53.650 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:53.650 "strip_size_kb": 0, 00:16:53.650 "state": "configuring", 00:16:53.650 "raid_level": "raid1", 00:16:53.650 "superblock": false, 00:16:53.650 "num_base_bdevs": 4, 00:16:53.650 "num_base_bdevs_discovered": 2, 00:16:53.650 "num_base_bdevs_operational": 4, 00:16:53.650 "base_bdevs_list": [ 00:16:53.650 { 00:16:53.650 "name": "BaseBdev1", 00:16:53.650 "uuid": "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b", 00:16:53.650 "is_configured": true, 00:16:53.650 "data_offset": 0, 00:16:53.650 "data_size": 65536 00:16:53.650 }, 00:16:53.650 { 00:16:53.650 "name": null, 00:16:53.650 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:16:53.650 "is_configured": false, 00:16:53.650 "data_offset": 0, 00:16:53.650 "data_size": 65536 00:16:53.650 }, 00:16:53.650 { 00:16:53.650 "name": null, 00:16:53.650 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:16:53.650 "is_configured": false, 00:16:53.650 "data_offset": 0, 00:16:53.650 "data_size": 65536 00:16:53.650 }, 00:16:53.650 { 00:16:53.650 "name": "BaseBdev4", 00:16:53.650 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:16:53.650 "is_configured": true, 00:16:53.650 "data_offset": 0, 00:16:53.650 "data_size": 65536 00:16:53.650 } 00:16:53.650 ] 00:16:53.650 }' 00:16:53.650 04:17:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:53.650 04:17:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:54.214 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.214 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:54.471 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:16:54.471 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:54.729 [2024-05-15 04:17:42.535345] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 
00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.729 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:54.986 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:54.986 "name": "Existed_Raid", 00:16:54.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.986 "strip_size_kb": 0, 00:16:54.986 "state": "configuring", 00:16:54.986 "raid_level": "raid1", 00:16:54.986 "superblock": false, 00:16:54.986 "num_base_bdevs": 4, 00:16:54.986 "num_base_bdevs_discovered": 3, 00:16:54.986 "num_base_bdevs_operational": 4, 00:16:54.986 "base_bdevs_list": [ 00:16:54.986 { 00:16:54.986 "name": "BaseBdev1", 00:16:54.986 "uuid": "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b", 00:16:54.986 "is_configured": true, 00:16:54.986 "data_offset": 0, 00:16:54.986 "data_size": 65536 00:16:54.986 }, 00:16:54.986 { 00:16:54.986 "name": null, 00:16:54.986 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:16:54.986 "is_configured": false, 00:16:54.986 "data_offset": 0, 00:16:54.986 "data_size": 65536 00:16:54.986 }, 00:16:54.986 { 00:16:54.986 "name": "BaseBdev3", 00:16:54.986 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:16:54.986 "is_configured": true, 00:16:54.986 "data_offset": 0, 00:16:54.986 "data_size": 65536 00:16:54.986 }, 00:16:54.986 { 00:16:54.986 "name": "BaseBdev4", 00:16:54.986 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:16:54.986 "is_configured": true, 00:16:54.986 "data_offset": 0, 00:16:54.986 "data_size": 65536 00:16:54.986 } 00:16:54.986 ] 00:16:54.986 }' 00:16:54.986 04:17:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:54.986 04:17:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:55.552 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.552 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:55.810 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:16:55.810 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:55.810 [2024-05-15 04:17:43.810789] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 
00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.067 04:17:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:56.325 04:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:56.325 "name": "Existed_Raid", 00:16:56.325 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:56.325 "strip_size_kb": 0, 00:16:56.325 "state": "configuring", 00:16:56.325 "raid_level": "raid1", 00:16:56.325 "superblock": false, 00:16:56.325 "num_base_bdevs": 4, 00:16:56.325 "num_base_bdevs_discovered": 2, 00:16:56.325 "num_base_bdevs_operational": 4, 00:16:56.325 "base_bdevs_list": [ 00:16:56.325 { 00:16:56.325 "name": null, 00:16:56.325 "uuid": "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b", 00:16:56.325 "is_configured": false, 00:16:56.325 "data_offset": 0, 00:16:56.325 "data_size": 65536 00:16:56.325 }, 00:16:56.325 { 00:16:56.325 "name": null, 00:16:56.325 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:16:56.325 "is_configured": false, 00:16:56.325 "data_offset": 0, 00:16:56.325 "data_size": 65536 00:16:56.325 }, 00:16:56.325 { 00:16:56.325 "name": "BaseBdev3", 00:16:56.325 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:16:56.325 "is_configured": true, 00:16:56.325 "data_offset": 0, 00:16:56.325 "data_size": 65536 00:16:56.325 }, 00:16:56.325 { 00:16:56.325 "name": "BaseBdev4", 00:16:56.325 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:16:56.325 "is_configured": true, 00:16:56.325 "data_offset": 0, 00:16:56.325 "data_size": 65536 00:16:56.325 } 00:16:56.325 ] 00:16:56.325 }' 00:16:56.325 04:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:56.325 04:17:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:56.891 04:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.891 04:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:56.891 04:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:16:56.891 04:17:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:57.148 [2024-05-15 04:17:45.156447] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
raid_level=raid1 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.406 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.664 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:57.664 "name": "Existed_Raid", 00:16:57.664 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:57.664 "strip_size_kb": 0, 00:16:57.664 "state": "configuring", 00:16:57.664 "raid_level": "raid1", 00:16:57.664 "superblock": false, 00:16:57.664 "num_base_bdevs": 4, 00:16:57.664 "num_base_bdevs_discovered": 3, 00:16:57.664 "num_base_bdevs_operational": 4, 00:16:57.664 "base_bdevs_list": [ 00:16:57.664 { 00:16:57.664 "name": null, 00:16:57.664 "uuid": "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b", 00:16:57.664 "is_configured": false, 00:16:57.664 "data_offset": 0, 00:16:57.664 "data_size": 65536 00:16:57.664 }, 00:16:57.664 { 00:16:57.664 "name": "BaseBdev2", 00:16:57.664 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:16:57.664 "is_configured": true, 00:16:57.664 "data_offset": 0, 00:16:57.664 "data_size": 65536 00:16:57.664 }, 00:16:57.664 { 00:16:57.664 "name": "BaseBdev3", 00:16:57.664 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:16:57.664 "is_configured": true, 00:16:57.664 "data_offset": 0, 00:16:57.664 "data_size": 65536 00:16:57.664 }, 00:16:57.664 { 00:16:57.664 "name": "BaseBdev4", 00:16:57.664 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:16:57.664 "is_configured": true, 00:16:57.664 "data_offset": 0, 00:16:57.664 "data_size": 65536 00:16:57.664 } 00:16:57.664 ] 00:16:57.664 }' 00:16:57.664 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:57.664 04:17:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.231 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.231 04:17:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:58.231 04:17:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:16:58.231 04:17:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.231 04:17:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:58.525 04:17:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b 00:16:58.804 [2024-05-15 04:17:46.701145] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:58.804 [2024-05-15 04:17:46.701195] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d021d0 00:16:58.804 [2024-05-15 04:17:46.701205] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:58.804 [2024-05-15 04:17:46.701387] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d05640 00:16:58.804 [2024-05-15 04:17:46.701525] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d021d0 00:16:58.804 [2024-05-15 04:17:46.701538] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d021d0 00:16:58.804 [2024-05-15 04:17:46.701752] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:58.804 NewBaseBdev 00:16:58.804 04:17:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:16:58.804 04:17:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:16:58.804 04:17:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:16:58.804 04:17:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local i 00:16:58.804 04:17:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:16:58.804 04:17:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:16:58.804 04:17:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:59.062 04:17:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:59.320 [ 00:16:59.320 { 00:16:59.320 "name": "NewBaseBdev", 00:16:59.320 "aliases": [ 00:16:59.320 "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b" 00:16:59.320 ], 00:16:59.320 "product_name": "Malloc disk", 00:16:59.320 "block_size": 512, 00:16:59.320 "num_blocks": 65536, 00:16:59.320 "uuid": "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b", 00:16:59.320 "assigned_rate_limits": { 00:16:59.320 "rw_ios_per_sec": 0, 00:16:59.320 "rw_mbytes_per_sec": 0, 00:16:59.320 "r_mbytes_per_sec": 0, 00:16:59.320 "w_mbytes_per_sec": 0 00:16:59.320 }, 00:16:59.320 "claimed": true, 00:16:59.320 "claim_type": "exclusive_write", 00:16:59.320 "zoned": false, 00:16:59.320 "supported_io_types": { 00:16:59.320 "read": true, 00:16:59.320 "write": true, 00:16:59.320 "unmap": true, 00:16:59.320 "write_zeroes": true, 00:16:59.320 "flush": true, 00:16:59.320 "reset": true, 00:16:59.320 "compare": false, 00:16:59.320 "compare_and_write": false, 00:16:59.320 "abort": true, 00:16:59.320 "nvme_admin": false, 00:16:59.320 "nvme_io": false 00:16:59.320 }, 00:16:59.320 "memory_domains": [ 00:16:59.320 { 00:16:59.320 "dma_device_id": "system", 00:16:59.320 "dma_device_type": 1 00:16:59.320 }, 00:16:59.320 { 00:16:59.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.320 "dma_device_type": 2 00:16:59.320 } 00:16:59.320 ], 00:16:59.320 "driver_specific": {} 00:16:59.320 } 00:16:59.320 ] 
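The key point in the step above is that the replacement malloc bdev is created with the UUID the raid still holds for the empty slot, so it is claimed into that slot on examine and the array can go online. A minimal sketch of that sequence, under the same socket/path assumptions as before:

  # Sketch: read back the UUID recorded for slot 0, recreate a 32 MiB, 512-byte-block
  # malloc bdev under that UUID, and wait for it to be examined and claimed.
  rpc() { ./scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

  uuid=$(rpc bdev_raid_get_bdevs all | jq -r '.[0].base_bdevs_list[0].uuid')
  rpc bdev_malloc_create 32 512 -b NewBaseBdev -u "$uuid"
  rpc bdev_wait_for_examine
  rpc bdev_get_bdevs -b NewBaseBdev -t 2000   # blocks up to 2000 ms until the bdev is visible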
00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # return 0 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.320 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:59.579 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:16:59.579 "name": "Existed_Raid", 00:16:59.579 "uuid": "eefcebb8-5de4-45ae-accb-ef08ea5edd3b", 00:16:59.579 "strip_size_kb": 0, 00:16:59.579 "state": "online", 00:16:59.579 "raid_level": "raid1", 00:16:59.579 "superblock": false, 00:16:59.579 "num_base_bdevs": 4, 00:16:59.579 "num_base_bdevs_discovered": 4, 00:16:59.579 "num_base_bdevs_operational": 4, 00:16:59.579 "base_bdevs_list": [ 00:16:59.579 { 00:16:59.579 "name": "NewBaseBdev", 00:16:59.579 "uuid": "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b", 00:16:59.579 "is_configured": true, 00:16:59.579 "data_offset": 0, 00:16:59.579 "data_size": 65536 00:16:59.579 }, 00:16:59.579 { 00:16:59.579 "name": "BaseBdev2", 00:16:59.579 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:16:59.579 "is_configured": true, 00:16:59.579 "data_offset": 0, 00:16:59.579 "data_size": 65536 00:16:59.579 }, 00:16:59.579 { 00:16:59.579 "name": "BaseBdev3", 00:16:59.579 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:16:59.579 "is_configured": true, 00:16:59.579 "data_offset": 0, 00:16:59.579 "data_size": 65536 00:16:59.579 }, 00:16:59.579 { 00:16:59.579 "name": "BaseBdev4", 00:16:59.579 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:16:59.579 "is_configured": true, 00:16:59.579 "data_offset": 0, 00:16:59.579 "data_size": 65536 00:16:59.579 } 00:16:59.579 ] 00:16:59.579 }' 00:16:59.579 04:17:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:16:59.579 04:17:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.144 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:17:00.144 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:17:00.144 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # 
local raid_bdev_info 00:17:00.144 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:00.144 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:00.144 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@199 -- # local name 00:17:00.144 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:00.144 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:00.402 [2024-05-15 04:17:48.289762] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:00.402 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:00.402 "name": "Existed_Raid", 00:17:00.402 "aliases": [ 00:17:00.402 "eefcebb8-5de4-45ae-accb-ef08ea5edd3b" 00:17:00.402 ], 00:17:00.402 "product_name": "Raid Volume", 00:17:00.402 "block_size": 512, 00:17:00.402 "num_blocks": 65536, 00:17:00.402 "uuid": "eefcebb8-5de4-45ae-accb-ef08ea5edd3b", 00:17:00.402 "assigned_rate_limits": { 00:17:00.402 "rw_ios_per_sec": 0, 00:17:00.402 "rw_mbytes_per_sec": 0, 00:17:00.402 "r_mbytes_per_sec": 0, 00:17:00.402 "w_mbytes_per_sec": 0 00:17:00.402 }, 00:17:00.402 "claimed": false, 00:17:00.402 "zoned": false, 00:17:00.402 "supported_io_types": { 00:17:00.402 "read": true, 00:17:00.402 "write": true, 00:17:00.402 "unmap": false, 00:17:00.402 "write_zeroes": true, 00:17:00.402 "flush": false, 00:17:00.402 "reset": true, 00:17:00.402 "compare": false, 00:17:00.402 "compare_and_write": false, 00:17:00.402 "abort": false, 00:17:00.402 "nvme_admin": false, 00:17:00.402 "nvme_io": false 00:17:00.402 }, 00:17:00.402 "memory_domains": [ 00:17:00.402 { 00:17:00.402 "dma_device_id": "system", 00:17:00.402 "dma_device_type": 1 00:17:00.402 }, 00:17:00.402 { 00:17:00.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.402 "dma_device_type": 2 00:17:00.402 }, 00:17:00.402 { 00:17:00.402 "dma_device_id": "system", 00:17:00.402 "dma_device_type": 1 00:17:00.402 }, 00:17:00.402 { 00:17:00.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.402 "dma_device_type": 2 00:17:00.402 }, 00:17:00.402 { 00:17:00.402 "dma_device_id": "system", 00:17:00.402 "dma_device_type": 1 00:17:00.402 }, 00:17:00.402 { 00:17:00.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.402 "dma_device_type": 2 00:17:00.402 }, 00:17:00.402 { 00:17:00.402 "dma_device_id": "system", 00:17:00.402 "dma_device_type": 1 00:17:00.402 }, 00:17:00.402 { 00:17:00.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.402 "dma_device_type": 2 00:17:00.402 } 00:17:00.402 ], 00:17:00.402 "driver_specific": { 00:17:00.402 "raid": { 00:17:00.402 "uuid": "eefcebb8-5de4-45ae-accb-ef08ea5edd3b", 00:17:00.402 "strip_size_kb": 0, 00:17:00.402 "state": "online", 00:17:00.402 "raid_level": "raid1", 00:17:00.402 "superblock": false, 00:17:00.402 "num_base_bdevs": 4, 00:17:00.402 "num_base_bdevs_discovered": 4, 00:17:00.402 "num_base_bdevs_operational": 4, 00:17:00.402 "base_bdevs_list": [ 00:17:00.402 { 00:17:00.402 "name": "NewBaseBdev", 00:17:00.402 "uuid": "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b", 00:17:00.402 "is_configured": true, 00:17:00.402 "data_offset": 0, 00:17:00.402 "data_size": 65536 00:17:00.403 }, 00:17:00.403 { 00:17:00.403 "name": "BaseBdev2", 00:17:00.403 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:17:00.403 "is_configured": true, 
00:17:00.403 "data_offset": 0, 00:17:00.403 "data_size": 65536 00:17:00.403 }, 00:17:00.403 { 00:17:00.403 "name": "BaseBdev3", 00:17:00.403 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:17:00.403 "is_configured": true, 00:17:00.403 "data_offset": 0, 00:17:00.403 "data_size": 65536 00:17:00.403 }, 00:17:00.403 { 00:17:00.403 "name": "BaseBdev4", 00:17:00.403 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:17:00.403 "is_configured": true, 00:17:00.403 "data_offset": 0, 00:17:00.403 "data_size": 65536 00:17:00.403 } 00:17:00.403 ] 00:17:00.403 } 00:17:00.403 } 00:17:00.403 }' 00:17:00.403 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:00.403 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:17:00.403 BaseBdev2 00:17:00.403 BaseBdev3 00:17:00.403 BaseBdev4' 00:17:00.403 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:00.403 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:00.403 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:00.661 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:00.661 "name": "NewBaseBdev", 00:17:00.661 "aliases": [ 00:17:00.661 "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b" 00:17:00.661 ], 00:17:00.661 "product_name": "Malloc disk", 00:17:00.661 "block_size": 512, 00:17:00.661 "num_blocks": 65536, 00:17:00.661 "uuid": "b9c26322-0fe2-4f95-b7bc-4fb9bbf6481b", 00:17:00.661 "assigned_rate_limits": { 00:17:00.661 "rw_ios_per_sec": 0, 00:17:00.661 "rw_mbytes_per_sec": 0, 00:17:00.661 "r_mbytes_per_sec": 0, 00:17:00.661 "w_mbytes_per_sec": 0 00:17:00.661 }, 00:17:00.661 "claimed": true, 00:17:00.661 "claim_type": "exclusive_write", 00:17:00.661 "zoned": false, 00:17:00.661 "supported_io_types": { 00:17:00.661 "read": true, 00:17:00.661 "write": true, 00:17:00.661 "unmap": true, 00:17:00.661 "write_zeroes": true, 00:17:00.661 "flush": true, 00:17:00.661 "reset": true, 00:17:00.661 "compare": false, 00:17:00.661 "compare_and_write": false, 00:17:00.661 "abort": true, 00:17:00.661 "nvme_admin": false, 00:17:00.661 "nvme_io": false 00:17:00.661 }, 00:17:00.661 "memory_domains": [ 00:17:00.661 { 00:17:00.661 "dma_device_id": "system", 00:17:00.661 "dma_device_type": 1 00:17:00.661 }, 00:17:00.661 { 00:17:00.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.661 "dma_device_type": 2 00:17:00.661 } 00:17:00.661 ], 00:17:00.661 "driver_specific": {} 00:17:00.661 }' 00:17:00.661 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:00.661 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:00.661 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:00.661 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:00.919 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:00.919 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.919 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:00.919 04:17:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:00.919 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.919 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:00.919 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:00.919 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:00.919 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:00.919 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:00.919 04:17:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:01.177 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:01.177 "name": "BaseBdev2", 00:17:01.177 "aliases": [ 00:17:01.177 "b6740475-e5be-4922-9c15-0b2c9d1e655b" 00:17:01.177 ], 00:17:01.177 "product_name": "Malloc disk", 00:17:01.177 "block_size": 512, 00:17:01.177 "num_blocks": 65536, 00:17:01.177 "uuid": "b6740475-e5be-4922-9c15-0b2c9d1e655b", 00:17:01.177 "assigned_rate_limits": { 00:17:01.177 "rw_ios_per_sec": 0, 00:17:01.177 "rw_mbytes_per_sec": 0, 00:17:01.177 "r_mbytes_per_sec": 0, 00:17:01.177 "w_mbytes_per_sec": 0 00:17:01.177 }, 00:17:01.177 "claimed": true, 00:17:01.177 "claim_type": "exclusive_write", 00:17:01.177 "zoned": false, 00:17:01.177 "supported_io_types": { 00:17:01.177 "read": true, 00:17:01.177 "write": true, 00:17:01.177 "unmap": true, 00:17:01.177 "write_zeroes": true, 00:17:01.177 "flush": true, 00:17:01.177 "reset": true, 00:17:01.177 "compare": false, 00:17:01.177 "compare_and_write": false, 00:17:01.177 "abort": true, 00:17:01.177 "nvme_admin": false, 00:17:01.177 "nvme_io": false 00:17:01.177 }, 00:17:01.177 "memory_domains": [ 00:17:01.177 { 00:17:01.177 "dma_device_id": "system", 00:17:01.177 "dma_device_type": 1 00:17:01.177 }, 00:17:01.177 { 00:17:01.177 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.177 "dma_device_type": 2 00:17:01.177 } 00:17:01.177 ], 00:17:01.177 "driver_specific": {} 00:17:01.177 }' 00:17:01.177 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:01.177 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ 
null == null ]] 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:01.435 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:01.693 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:01.693 "name": "BaseBdev3", 00:17:01.693 "aliases": [ 00:17:01.693 "6e56ab07-1780-438e-aa29-175b5167ebe9" 00:17:01.693 ], 00:17:01.693 "product_name": "Malloc disk", 00:17:01.693 "block_size": 512, 00:17:01.693 "num_blocks": 65536, 00:17:01.693 "uuid": "6e56ab07-1780-438e-aa29-175b5167ebe9", 00:17:01.693 "assigned_rate_limits": { 00:17:01.693 "rw_ios_per_sec": 0, 00:17:01.693 "rw_mbytes_per_sec": 0, 00:17:01.693 "r_mbytes_per_sec": 0, 00:17:01.693 "w_mbytes_per_sec": 0 00:17:01.693 }, 00:17:01.693 "claimed": true, 00:17:01.693 "claim_type": "exclusive_write", 00:17:01.693 "zoned": false, 00:17:01.693 "supported_io_types": { 00:17:01.693 "read": true, 00:17:01.693 "write": true, 00:17:01.693 "unmap": true, 00:17:01.693 "write_zeroes": true, 00:17:01.693 "flush": true, 00:17:01.693 "reset": true, 00:17:01.693 "compare": false, 00:17:01.693 "compare_and_write": false, 00:17:01.693 "abort": true, 00:17:01.693 "nvme_admin": false, 00:17:01.693 "nvme_io": false 00:17:01.693 }, 00:17:01.693 "memory_domains": [ 00:17:01.693 { 00:17:01.693 "dma_device_id": "system", 00:17:01.693 "dma_device_type": 1 00:17:01.693 }, 00:17:01.693 { 00:17:01.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:01.693 "dma_device_type": 2 00:17:01.693 } 00:17:01.693 ], 00:17:01.693 "driver_specific": {} 00:17:01.693 }' 00:17:01.693 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:01.693 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:01.951 04:17:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:02.209 04:17:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:02.209 "name": "BaseBdev4", 00:17:02.209 "aliases": [ 00:17:02.209 "f8c94d46-3476-41dd-8085-2163b5e45a74" 00:17:02.209 ], 00:17:02.209 "product_name": "Malloc disk", 00:17:02.209 "block_size": 512, 00:17:02.209 "num_blocks": 65536, 00:17:02.209 "uuid": "f8c94d46-3476-41dd-8085-2163b5e45a74", 00:17:02.209 "assigned_rate_limits": { 00:17:02.209 "rw_ios_per_sec": 0, 00:17:02.209 "rw_mbytes_per_sec": 0, 00:17:02.209 "r_mbytes_per_sec": 0, 00:17:02.209 "w_mbytes_per_sec": 0 00:17:02.209 }, 00:17:02.209 "claimed": true, 00:17:02.209 "claim_type": "exclusive_write", 00:17:02.209 "zoned": false, 00:17:02.209 "supported_io_types": { 00:17:02.209 "read": true, 00:17:02.209 "write": true, 00:17:02.209 "unmap": true, 00:17:02.209 "write_zeroes": true, 00:17:02.209 "flush": true, 00:17:02.209 "reset": true, 00:17:02.209 "compare": false, 00:17:02.209 "compare_and_write": false, 00:17:02.209 "abort": true, 00:17:02.209 "nvme_admin": false, 00:17:02.209 "nvme_io": false 00:17:02.209 }, 00:17:02.209 "memory_domains": [ 00:17:02.209 { 00:17:02.209 "dma_device_id": "system", 00:17:02.209 "dma_device_type": 1 00:17:02.209 }, 00:17:02.209 { 00:17:02.209 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.209 "dma_device_type": 2 00:17:02.209 } 00:17:02.209 ], 00:17:02.209 "driver_specific": {} 00:17:02.209 }' 00:17:02.209 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:02.467 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:02.467 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:02.467 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:02.467 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:02.467 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.467 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:02.467 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:02.467 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:02.467 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:02.467 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:02.725 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:02.725 04:17:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:02.725 [2024-05-15 04:17:50.711978] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:02.725 [2024-05-15 04:17:50.712001] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:02.725 [2024-05-15 04:17:50.712065] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:02.725 [2024-05-15 04:17:50.712315] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:02.725 [2024-05-15 04:17:50.712330] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d021d0 name Existed_Raid, state offline 00:17:02.725 04:17:50 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@342 -- # killprocess 3892335 00:17:02.725 04:17:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@946 -- # '[' -z 3892335 ']' 00:17:02.725 04:17:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # kill -0 3892335 00:17:02.725 04:17:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # uname 00:17:02.725 04:17:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:17:02.725 04:17:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3892335 00:17:02.983 04:17:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:17:02.983 04:17:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:17:02.983 04:17:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3892335' 00:17:02.983 killing process with pid 3892335 00:17:02.983 04:17:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@965 -- # kill 3892335 00:17:02.984 [2024-05-15 04:17:50.762411] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:02.984 04:17:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@970 -- # wait 3892335 00:17:02.984 [2024-05-15 04:17:50.809227] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@344 -- # return 0 00:17:03.242 00:17:03.242 real 0m31.868s 00:17:03.242 user 0m59.544s 00:17:03.242 sys 0m4.308s 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:03.242 ************************************ 00:17:03.242 END TEST raid_state_function_test 00:17:03.242 ************************************ 00:17:03.242 04:17:51 bdev_raid -- bdev/bdev_raid.sh@804 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:17:03.242 04:17:51 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:17:03.242 04:17:51 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:17:03.242 04:17:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:03.242 ************************************ 00:17:03.242 START TEST raid_state_function_test_sb 00:17:03.242 ************************************ 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 4 true 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=4 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev3 00:17:03.242 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # echo BaseBdev4 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # raid_pid=3896729 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3896729' 00:17:03.243 Process raid pid: 3896729 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@247 -- # waitforlisten 3896729 /var/tmp/spdk-raid.sock 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3896729 ']' 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:17:03.243 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:17:03.243 04:17:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:03.243 [2024-05-15 04:17:51.164123] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:17:03.243 [2024-05-15 04:17:51.164187] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:03.243 [2024-05-15 04:17:51.238425] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:03.501 [2024-05-15 04:17:51.345904] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:17:03.501 [2024-05-15 04:17:51.412198] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:03.501 [2024-05-15 04:17:51.412231] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # return 0 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:04.435 [2024-05-15 04:17:52.311398] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:04.435 [2024-05-15 04:17:52.311438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:04.435 [2024-05-15 04:17:52.311449] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:04.435 [2024-05-15 04:17:52.311459] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:04.435 [2024-05-15 04:17:52.311467] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:04.435 [2024-05-15 04:17:52.311476] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:04.435 [2024-05-15 04:17:52.311483] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:04.435 [2024-05-15 04:17:52.311493] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:04.435 04:17:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.435 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.693 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:04.693 "name": "Existed_Raid", 00:17:04.693 "uuid": "fa92d800-ae55-42a0-a5fe-408064670db8", 00:17:04.693 "strip_size_kb": 0, 00:17:04.693 "state": "configuring", 00:17:04.693 "raid_level": "raid1", 00:17:04.693 "superblock": true, 00:17:04.693 "num_base_bdevs": 4, 00:17:04.693 "num_base_bdevs_discovered": 0, 00:17:04.693 "num_base_bdevs_operational": 4, 00:17:04.693 "base_bdevs_list": [ 00:17:04.693 { 00:17:04.693 "name": "BaseBdev1", 00:17:04.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.693 "is_configured": false, 00:17:04.693 "data_offset": 0, 00:17:04.693 "data_size": 0 00:17:04.693 }, 00:17:04.693 { 00:17:04.693 "name": "BaseBdev2", 00:17:04.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.693 "is_configured": false, 00:17:04.693 "data_offset": 0, 00:17:04.693 "data_size": 0 00:17:04.693 }, 00:17:04.693 { 00:17:04.693 "name": "BaseBdev3", 00:17:04.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.693 "is_configured": false, 00:17:04.693 "data_offset": 0, 00:17:04.693 "data_size": 0 00:17:04.693 }, 00:17:04.693 { 00:17:04.693 "name": "BaseBdev4", 00:17:04.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.693 "is_configured": false, 00:17:04.693 "data_offset": 0, 00:17:04.693 "data_size": 0 00:17:04.693 } 00:17:04.693 ] 00:17:04.693 }' 00:17:04.693 04:17:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:04.693 04:17:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:05.260 04:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:05.518 [2024-05-15 04:17:53.362045] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:05.518 [2024-05-15 04:17:53.362079] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f1f040 name Existed_Raid, state configuring 00:17:05.518 04:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:05.775 [2024-05-15 04:17:53.598694] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:05.775 [2024-05-15 04:17:53.598729] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:05.775 [2024-05-15 04:17:53.598739] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:05.775 [2024-05-15 04:17:53.598749] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist 
now 00:17:05.775 [2024-05-15 04:17:53.598756] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:05.775 [2024-05-15 04:17:53.598766] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:05.775 [2024-05-15 04:17:53.598773] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:05.775 [2024-05-15 04:17:53.598782] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:05.775 04:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:06.032 [2024-05-15 04:17:53.846850] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:06.032 BaseBdev1 00:17:06.032 04:17:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:17:06.032 04:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:17:06.032 04:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:06.032 04:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:06.032 04:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:06.032 04:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:06.032 04:17:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:06.290 04:17:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:06.548 [ 00:17:06.548 { 00:17:06.548 "name": "BaseBdev1", 00:17:06.548 "aliases": [ 00:17:06.548 "b92ded9d-5b36-403e-84cd-147619946756" 00:17:06.548 ], 00:17:06.548 "product_name": "Malloc disk", 00:17:06.548 "block_size": 512, 00:17:06.548 "num_blocks": 65536, 00:17:06.548 "uuid": "b92ded9d-5b36-403e-84cd-147619946756", 00:17:06.548 "assigned_rate_limits": { 00:17:06.548 "rw_ios_per_sec": 0, 00:17:06.548 "rw_mbytes_per_sec": 0, 00:17:06.548 "r_mbytes_per_sec": 0, 00:17:06.548 "w_mbytes_per_sec": 0 00:17:06.548 }, 00:17:06.548 "claimed": true, 00:17:06.548 "claim_type": "exclusive_write", 00:17:06.548 "zoned": false, 00:17:06.548 "supported_io_types": { 00:17:06.548 "read": true, 00:17:06.548 "write": true, 00:17:06.548 "unmap": true, 00:17:06.548 "write_zeroes": true, 00:17:06.548 "flush": true, 00:17:06.548 "reset": true, 00:17:06.548 "compare": false, 00:17:06.548 "compare_and_write": false, 00:17:06.548 "abort": true, 00:17:06.548 "nvme_admin": false, 00:17:06.548 "nvme_io": false 00:17:06.548 }, 00:17:06.548 "memory_domains": [ 00:17:06.548 { 00:17:06.548 "dma_device_id": "system", 00:17:06.548 "dma_device_type": 1 00:17:06.548 }, 00:17:06.548 { 00:17:06.548 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.549 "dma_device_type": 2 00:17:06.549 } 00:17:06.549 ], 00:17:06.549 "driver_specific": {} 00:17:06.549 } 00:17:06.549 ] 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.549 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:06.806 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:06.806 "name": "Existed_Raid", 00:17:06.806 "uuid": "4d11d69c-37b9-4c4d-9d8e-469d3347fcb7", 00:17:06.806 "strip_size_kb": 0, 00:17:06.806 "state": "configuring", 00:17:06.806 "raid_level": "raid1", 00:17:06.806 "superblock": true, 00:17:06.806 "num_base_bdevs": 4, 00:17:06.806 "num_base_bdevs_discovered": 1, 00:17:06.806 "num_base_bdevs_operational": 4, 00:17:06.806 "base_bdevs_list": [ 00:17:06.806 { 00:17:06.806 "name": "BaseBdev1", 00:17:06.806 "uuid": "b92ded9d-5b36-403e-84cd-147619946756", 00:17:06.806 "is_configured": true, 00:17:06.806 "data_offset": 2048, 00:17:06.806 "data_size": 63488 00:17:06.806 }, 00:17:06.806 { 00:17:06.806 "name": "BaseBdev2", 00:17:06.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.806 "is_configured": false, 00:17:06.806 "data_offset": 0, 00:17:06.806 "data_size": 0 00:17:06.806 }, 00:17:06.806 { 00:17:06.806 "name": "BaseBdev3", 00:17:06.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.806 "is_configured": false, 00:17:06.806 "data_offset": 0, 00:17:06.806 "data_size": 0 00:17:06.806 }, 00:17:06.806 { 00:17:06.806 "name": "BaseBdev4", 00:17:06.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.806 "is_configured": false, 00:17:06.806 "data_offset": 0, 00:17:06.806 "data_size": 0 00:17:06.806 } 00:17:06.806 ] 00:17:06.806 }' 00:17:06.807 04:17:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:06.807 04:17:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:07.372 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:07.372 [2024-05-15 04:17:55.354783] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:07.372 [2024-05-15 04:17:55.354853] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f1e8b0 name 
Existed_Raid, state configuring 00:17:07.372 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:07.629 [2024-05-15 04:17:55.595458] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:07.629 [2024-05-15 04:17:55.596963] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:07.629 [2024-05-15 04:17:55.596997] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:07.629 [2024-05-15 04:17:55.597010] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:07.629 [2024-05-15 04:17:55.597022] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:07.629 [2024-05-15 04:17:55.597032] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:07.629 [2024-05-15 04:17:55.597045] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.629 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:07.886 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:07.886 "name": "Existed_Raid", 00:17:07.886 "uuid": "d09d7d3f-263c-4d1e-96ba-58a67cb3188a", 00:17:07.886 "strip_size_kb": 0, 00:17:07.886 "state": "configuring", 00:17:07.886 "raid_level": "raid1", 00:17:07.886 "superblock": true, 00:17:07.886 "num_base_bdevs": 4, 00:17:07.886 "num_base_bdevs_discovered": 1, 00:17:07.886 "num_base_bdevs_operational": 4, 00:17:07.886 "base_bdevs_list": [ 00:17:07.886 { 00:17:07.886 "name": "BaseBdev1", 00:17:07.886 "uuid": 
"b92ded9d-5b36-403e-84cd-147619946756", 00:17:07.886 "is_configured": true, 00:17:07.886 "data_offset": 2048, 00:17:07.886 "data_size": 63488 00:17:07.886 }, 00:17:07.886 { 00:17:07.886 "name": "BaseBdev2", 00:17:07.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.886 "is_configured": false, 00:17:07.886 "data_offset": 0, 00:17:07.886 "data_size": 0 00:17:07.886 }, 00:17:07.886 { 00:17:07.886 "name": "BaseBdev3", 00:17:07.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.886 "is_configured": false, 00:17:07.886 "data_offset": 0, 00:17:07.886 "data_size": 0 00:17:07.886 }, 00:17:07.886 { 00:17:07.886 "name": "BaseBdev4", 00:17:07.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:07.886 "is_configured": false, 00:17:07.886 "data_offset": 0, 00:17:07.886 "data_size": 0 00:17:07.886 } 00:17:07.886 ] 00:17:07.886 }' 00:17:07.886 04:17:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:07.886 04:17:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:08.449 04:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:08.756 [2024-05-15 04:17:56.647430] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:08.756 BaseBdev2 00:17:08.756 04:17:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:17:08.756 04:17:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:17:08.756 04:17:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:08.756 04:17:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:08.756 04:17:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:08.756 04:17:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:08.756 04:17:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:09.012 04:17:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:09.267 [ 00:17:09.267 { 00:17:09.267 "name": "BaseBdev2", 00:17:09.267 "aliases": [ 00:17:09.267 "bc0e71f5-6876-485b-bd03-adc3faa2bdb5" 00:17:09.267 ], 00:17:09.267 "product_name": "Malloc disk", 00:17:09.267 "block_size": 512, 00:17:09.267 "num_blocks": 65536, 00:17:09.267 "uuid": "bc0e71f5-6876-485b-bd03-adc3faa2bdb5", 00:17:09.267 "assigned_rate_limits": { 00:17:09.267 "rw_ios_per_sec": 0, 00:17:09.267 "rw_mbytes_per_sec": 0, 00:17:09.267 "r_mbytes_per_sec": 0, 00:17:09.267 "w_mbytes_per_sec": 0 00:17:09.267 }, 00:17:09.267 "claimed": true, 00:17:09.267 "claim_type": "exclusive_write", 00:17:09.267 "zoned": false, 00:17:09.267 "supported_io_types": { 00:17:09.267 "read": true, 00:17:09.267 "write": true, 00:17:09.267 "unmap": true, 00:17:09.267 "write_zeroes": true, 00:17:09.267 "flush": true, 00:17:09.267 "reset": true, 00:17:09.267 "compare": false, 00:17:09.267 "compare_and_write": false, 00:17:09.267 "abort": true, 00:17:09.267 "nvme_admin": false, 00:17:09.267 "nvme_io": 
false 00:17:09.267 }, 00:17:09.267 "memory_domains": [ 00:17:09.267 { 00:17:09.267 "dma_device_id": "system", 00:17:09.267 "dma_device_type": 1 00:17:09.267 }, 00:17:09.267 { 00:17:09.267 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.267 "dma_device_type": 2 00:17:09.267 } 00:17:09.267 ], 00:17:09.267 "driver_specific": {} 00:17:09.267 } 00:17:09.267 ] 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.267 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:09.524 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:09.524 "name": "Existed_Raid", 00:17:09.524 "uuid": "d09d7d3f-263c-4d1e-96ba-58a67cb3188a", 00:17:09.524 "strip_size_kb": 0, 00:17:09.524 "state": "configuring", 00:17:09.524 "raid_level": "raid1", 00:17:09.524 "superblock": true, 00:17:09.524 "num_base_bdevs": 4, 00:17:09.524 "num_base_bdevs_discovered": 2, 00:17:09.525 "num_base_bdevs_operational": 4, 00:17:09.525 "base_bdevs_list": [ 00:17:09.525 { 00:17:09.525 "name": "BaseBdev1", 00:17:09.525 "uuid": "b92ded9d-5b36-403e-84cd-147619946756", 00:17:09.525 "is_configured": true, 00:17:09.525 "data_offset": 2048, 00:17:09.525 "data_size": 63488 00:17:09.525 }, 00:17:09.525 { 00:17:09.525 "name": "BaseBdev2", 00:17:09.525 "uuid": "bc0e71f5-6876-485b-bd03-adc3faa2bdb5", 00:17:09.525 "is_configured": true, 00:17:09.525 "data_offset": 2048, 00:17:09.525 "data_size": 63488 00:17:09.525 }, 00:17:09.525 { 00:17:09.525 "name": "BaseBdev3", 00:17:09.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.525 "is_configured": false, 00:17:09.525 "data_offset": 0, 00:17:09.525 "data_size": 0 00:17:09.525 }, 00:17:09.525 { 00:17:09.525 "name": "BaseBdev4", 00:17:09.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.525 "is_configured": false, 00:17:09.525 
"data_offset": 0, 00:17:09.525 "data_size": 0 00:17:09.525 } 00:17:09.525 ] 00:17:09.525 }' 00:17:09.525 04:17:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:09.525 04:17:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:10.090 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:10.349 [2024-05-15 04:17:58.265237] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:10.349 BaseBdev3 00:17:10.349 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev3 00:17:10.349 04:17:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:17:10.349 04:17:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:10.349 04:17:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:10.349 04:17:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:10.349 04:17:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:10.349 04:17:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:10.607 04:17:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:10.865 [ 00:17:10.865 { 00:17:10.865 "name": "BaseBdev3", 00:17:10.865 "aliases": [ 00:17:10.865 "3cd7d043-fcae-4c06-a47d-05586fb8c921" 00:17:10.865 ], 00:17:10.865 "product_name": "Malloc disk", 00:17:10.865 "block_size": 512, 00:17:10.865 "num_blocks": 65536, 00:17:10.865 "uuid": "3cd7d043-fcae-4c06-a47d-05586fb8c921", 00:17:10.865 "assigned_rate_limits": { 00:17:10.865 "rw_ios_per_sec": 0, 00:17:10.865 "rw_mbytes_per_sec": 0, 00:17:10.865 "r_mbytes_per_sec": 0, 00:17:10.865 "w_mbytes_per_sec": 0 00:17:10.865 }, 00:17:10.865 "claimed": true, 00:17:10.865 "claim_type": "exclusive_write", 00:17:10.865 "zoned": false, 00:17:10.865 "supported_io_types": { 00:17:10.865 "read": true, 00:17:10.865 "write": true, 00:17:10.865 "unmap": true, 00:17:10.865 "write_zeroes": true, 00:17:10.865 "flush": true, 00:17:10.865 "reset": true, 00:17:10.865 "compare": false, 00:17:10.865 "compare_and_write": false, 00:17:10.865 "abort": true, 00:17:10.865 "nvme_admin": false, 00:17:10.865 "nvme_io": false 00:17:10.865 }, 00:17:10.865 "memory_domains": [ 00:17:10.865 { 00:17:10.865 "dma_device_id": "system", 00:17:10.865 "dma_device_type": 1 00:17:10.865 }, 00:17:10.865 { 00:17:10.865 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.865 "dma_device_type": 2 00:17:10.865 } 00:17:10.865 ], 00:17:10.865 "driver_specific": {} 00:17:10.865 } 00:17:10.865 ] 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 
-- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:10.865 04:17:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:11.123 04:17:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:11.123 "name": "Existed_Raid", 00:17:11.123 "uuid": "d09d7d3f-263c-4d1e-96ba-58a67cb3188a", 00:17:11.123 "strip_size_kb": 0, 00:17:11.123 "state": "configuring", 00:17:11.123 "raid_level": "raid1", 00:17:11.123 "superblock": true, 00:17:11.123 "num_base_bdevs": 4, 00:17:11.123 "num_base_bdevs_discovered": 3, 00:17:11.123 "num_base_bdevs_operational": 4, 00:17:11.123 "base_bdevs_list": [ 00:17:11.123 { 00:17:11.123 "name": "BaseBdev1", 00:17:11.123 "uuid": "b92ded9d-5b36-403e-84cd-147619946756", 00:17:11.123 "is_configured": true, 00:17:11.123 "data_offset": 2048, 00:17:11.123 "data_size": 63488 00:17:11.123 }, 00:17:11.123 { 00:17:11.123 "name": "BaseBdev2", 00:17:11.123 "uuid": "bc0e71f5-6876-485b-bd03-adc3faa2bdb5", 00:17:11.123 "is_configured": true, 00:17:11.123 "data_offset": 2048, 00:17:11.123 "data_size": 63488 00:17:11.123 }, 00:17:11.123 { 00:17:11.123 "name": "BaseBdev3", 00:17:11.123 "uuid": "3cd7d043-fcae-4c06-a47d-05586fb8c921", 00:17:11.123 "is_configured": true, 00:17:11.123 "data_offset": 2048, 00:17:11.123 "data_size": 63488 00:17:11.123 }, 00:17:11.123 { 00:17:11.123 "name": "BaseBdev4", 00:17:11.123 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:11.123 "is_configured": false, 00:17:11.123 "data_offset": 0, 00:17:11.123 "data_size": 0 00:17:11.123 } 00:17:11.123 ] 00:17:11.123 }' 00:17:11.123 04:17:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:11.123 04:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:11.689 04:17:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:11.947 [2024-05-15 04:17:59.919461] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:11.947 [2024-05-15 04:17:59.919731] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 
0x1f1f7f0 00:17:11.947 [2024-05-15 04:17:59.919750] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:11.947 [2024-05-15 04:17:59.919943] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20d35f0 00:17:11.947 [2024-05-15 04:17:59.920079] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f1f7f0 00:17:11.947 [2024-05-15 04:17:59.920093] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f1f7f0 00:17:11.947 [2024-05-15 04:17:59.920215] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:11.947 BaseBdev4 00:17:11.947 04:17:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev4 00:17:11.947 04:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:17:11.947 04:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:11.947 04:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:11.947 04:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:11.947 04:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:11.947 04:17:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:12.204 04:18:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:12.462 [ 00:17:12.462 { 00:17:12.462 "name": "BaseBdev4", 00:17:12.462 "aliases": [ 00:17:12.462 "d478592c-c1ed-445f-ab23-217d8ce17d5e" 00:17:12.462 ], 00:17:12.462 "product_name": "Malloc disk", 00:17:12.462 "block_size": 512, 00:17:12.462 "num_blocks": 65536, 00:17:12.462 "uuid": "d478592c-c1ed-445f-ab23-217d8ce17d5e", 00:17:12.462 "assigned_rate_limits": { 00:17:12.462 "rw_ios_per_sec": 0, 00:17:12.462 "rw_mbytes_per_sec": 0, 00:17:12.462 "r_mbytes_per_sec": 0, 00:17:12.462 "w_mbytes_per_sec": 0 00:17:12.462 }, 00:17:12.462 "claimed": true, 00:17:12.462 "claim_type": "exclusive_write", 00:17:12.462 "zoned": false, 00:17:12.462 "supported_io_types": { 00:17:12.462 "read": true, 00:17:12.462 "write": true, 00:17:12.462 "unmap": true, 00:17:12.462 "write_zeroes": true, 00:17:12.462 "flush": true, 00:17:12.462 "reset": true, 00:17:12.462 "compare": false, 00:17:12.462 "compare_and_write": false, 00:17:12.462 "abort": true, 00:17:12.462 "nvme_admin": false, 00:17:12.462 "nvme_io": false 00:17:12.462 }, 00:17:12.462 "memory_domains": [ 00:17:12.462 { 00:17:12.462 "dma_device_id": "system", 00:17:12.462 "dma_device_type": 1 00:17:12.463 }, 00:17:12.463 { 00:17:12.463 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.463 "dma_device_type": 2 00:17:12.463 } 00:17:12.463 ], 00:17:12.463 "driver_specific": {} 00:17:12.463 } 00:17:12.463 ] 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 
-- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.463 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:13.028 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:13.028 "name": "Existed_Raid", 00:17:13.028 "uuid": "d09d7d3f-263c-4d1e-96ba-58a67cb3188a", 00:17:13.028 "strip_size_kb": 0, 00:17:13.028 "state": "online", 00:17:13.028 "raid_level": "raid1", 00:17:13.028 "superblock": true, 00:17:13.028 "num_base_bdevs": 4, 00:17:13.028 "num_base_bdevs_discovered": 4, 00:17:13.028 "num_base_bdevs_operational": 4, 00:17:13.028 "base_bdevs_list": [ 00:17:13.028 { 00:17:13.028 "name": "BaseBdev1", 00:17:13.028 "uuid": "b92ded9d-5b36-403e-84cd-147619946756", 00:17:13.028 "is_configured": true, 00:17:13.028 "data_offset": 2048, 00:17:13.028 "data_size": 63488 00:17:13.028 }, 00:17:13.028 { 00:17:13.028 "name": "BaseBdev2", 00:17:13.028 "uuid": "bc0e71f5-6876-485b-bd03-adc3faa2bdb5", 00:17:13.028 "is_configured": true, 00:17:13.028 "data_offset": 2048, 00:17:13.028 "data_size": 63488 00:17:13.028 }, 00:17:13.028 { 00:17:13.028 "name": "BaseBdev3", 00:17:13.028 "uuid": "3cd7d043-fcae-4c06-a47d-05586fb8c921", 00:17:13.028 "is_configured": true, 00:17:13.028 "data_offset": 2048, 00:17:13.028 "data_size": 63488 00:17:13.028 }, 00:17:13.028 { 00:17:13.028 "name": "BaseBdev4", 00:17:13.028 "uuid": "d478592c-c1ed-445f-ab23-217d8ce17d5e", 00:17:13.028 "is_configured": true, 00:17:13.028 "data_offset": 2048, 00:17:13.028 "data_size": 63488 00:17:13.028 } 00:17:13.028 ] 00:17:13.028 }' 00:17:13.028 04:18:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:13.028 04:18:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:13.285 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:17:13.285 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:17:13.285 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:17:13.285 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_info 00:17:13.285 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:13.285 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:17:13.285 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:13.285 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:13.543 [2024-05-15 04:18:01.528121] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:13.543 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:13.543 "name": "Existed_Raid", 00:17:13.543 "aliases": [ 00:17:13.543 "d09d7d3f-263c-4d1e-96ba-58a67cb3188a" 00:17:13.543 ], 00:17:13.543 "product_name": "Raid Volume", 00:17:13.543 "block_size": 512, 00:17:13.543 "num_blocks": 63488, 00:17:13.543 "uuid": "d09d7d3f-263c-4d1e-96ba-58a67cb3188a", 00:17:13.543 "assigned_rate_limits": { 00:17:13.543 "rw_ios_per_sec": 0, 00:17:13.543 "rw_mbytes_per_sec": 0, 00:17:13.543 "r_mbytes_per_sec": 0, 00:17:13.543 "w_mbytes_per_sec": 0 00:17:13.543 }, 00:17:13.543 "claimed": false, 00:17:13.543 "zoned": false, 00:17:13.543 "supported_io_types": { 00:17:13.543 "read": true, 00:17:13.543 "write": true, 00:17:13.543 "unmap": false, 00:17:13.543 "write_zeroes": true, 00:17:13.543 "flush": false, 00:17:13.543 "reset": true, 00:17:13.543 "compare": false, 00:17:13.543 "compare_and_write": false, 00:17:13.543 "abort": false, 00:17:13.543 "nvme_admin": false, 00:17:13.543 "nvme_io": false 00:17:13.543 }, 00:17:13.543 "memory_domains": [ 00:17:13.543 { 00:17:13.543 "dma_device_id": "system", 00:17:13.543 "dma_device_type": 1 00:17:13.543 }, 00:17:13.543 { 00:17:13.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.543 "dma_device_type": 2 00:17:13.543 }, 00:17:13.543 { 00:17:13.543 "dma_device_id": "system", 00:17:13.543 "dma_device_type": 1 00:17:13.543 }, 00:17:13.543 { 00:17:13.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.543 "dma_device_type": 2 00:17:13.543 }, 00:17:13.543 { 00:17:13.543 "dma_device_id": "system", 00:17:13.543 "dma_device_type": 1 00:17:13.543 }, 00:17:13.543 { 00:17:13.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.543 "dma_device_type": 2 00:17:13.543 }, 00:17:13.543 { 00:17:13.543 "dma_device_id": "system", 00:17:13.543 "dma_device_type": 1 00:17:13.543 }, 00:17:13.543 { 00:17:13.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.543 "dma_device_type": 2 00:17:13.543 } 00:17:13.543 ], 00:17:13.543 "driver_specific": { 00:17:13.543 "raid": { 00:17:13.543 "uuid": "d09d7d3f-263c-4d1e-96ba-58a67cb3188a", 00:17:13.543 "strip_size_kb": 0, 00:17:13.543 "state": "online", 00:17:13.543 "raid_level": "raid1", 00:17:13.543 "superblock": true, 00:17:13.543 "num_base_bdevs": 4, 00:17:13.543 "num_base_bdevs_discovered": 4, 00:17:13.543 "num_base_bdevs_operational": 4, 00:17:13.543 "base_bdevs_list": [ 00:17:13.543 { 00:17:13.543 "name": "BaseBdev1", 00:17:13.543 "uuid": "b92ded9d-5b36-403e-84cd-147619946756", 00:17:13.543 "is_configured": true, 00:17:13.543 "data_offset": 2048, 00:17:13.543 "data_size": 63488 00:17:13.543 }, 00:17:13.543 { 00:17:13.543 "name": "BaseBdev2", 00:17:13.543 "uuid": "bc0e71f5-6876-485b-bd03-adc3faa2bdb5", 00:17:13.543 "is_configured": true, 00:17:13.543 "data_offset": 2048, 00:17:13.543 "data_size": 63488 00:17:13.543 }, 00:17:13.543 { 
00:17:13.543 "name": "BaseBdev3", 00:17:13.543 "uuid": "3cd7d043-fcae-4c06-a47d-05586fb8c921", 00:17:13.543 "is_configured": true, 00:17:13.543 "data_offset": 2048, 00:17:13.543 "data_size": 63488 00:17:13.543 }, 00:17:13.543 { 00:17:13.543 "name": "BaseBdev4", 00:17:13.543 "uuid": "d478592c-c1ed-445f-ab23-217d8ce17d5e", 00:17:13.543 "is_configured": true, 00:17:13.543 "data_offset": 2048, 00:17:13.543 "data_size": 63488 00:17:13.543 } 00:17:13.543 ] 00:17:13.543 } 00:17:13.543 } 00:17:13.543 }' 00:17:13.543 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:13.800 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:17:13.800 BaseBdev2 00:17:13.800 BaseBdev3 00:17:13.800 BaseBdev4' 00:17:13.800 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:13.800 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:13.800 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:14.058 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:14.058 "name": "BaseBdev1", 00:17:14.058 "aliases": [ 00:17:14.058 "b92ded9d-5b36-403e-84cd-147619946756" 00:17:14.058 ], 00:17:14.058 "product_name": "Malloc disk", 00:17:14.058 "block_size": 512, 00:17:14.058 "num_blocks": 65536, 00:17:14.058 "uuid": "b92ded9d-5b36-403e-84cd-147619946756", 00:17:14.058 "assigned_rate_limits": { 00:17:14.058 "rw_ios_per_sec": 0, 00:17:14.058 "rw_mbytes_per_sec": 0, 00:17:14.058 "r_mbytes_per_sec": 0, 00:17:14.058 "w_mbytes_per_sec": 0 00:17:14.058 }, 00:17:14.058 "claimed": true, 00:17:14.058 "claim_type": "exclusive_write", 00:17:14.058 "zoned": false, 00:17:14.058 "supported_io_types": { 00:17:14.058 "read": true, 00:17:14.058 "write": true, 00:17:14.058 "unmap": true, 00:17:14.058 "write_zeroes": true, 00:17:14.058 "flush": true, 00:17:14.058 "reset": true, 00:17:14.058 "compare": false, 00:17:14.058 "compare_and_write": false, 00:17:14.058 "abort": true, 00:17:14.058 "nvme_admin": false, 00:17:14.058 "nvme_io": false 00:17:14.058 }, 00:17:14.058 "memory_domains": [ 00:17:14.058 { 00:17:14.058 "dma_device_id": "system", 00:17:14.058 "dma_device_type": 1 00:17:14.058 }, 00:17:14.058 { 00:17:14.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.058 "dma_device_type": 2 00:17:14.058 } 00:17:14.058 ], 00:17:14.058 "driver_specific": {} 00:17:14.058 }' 00:17:14.058 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:14.058 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:14.058 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:14.058 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:14.058 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:14.058 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:14.058 04:18:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:14.058 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.md_interleave 00:17:14.058 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:14.058 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:14.315 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:14.315 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:14.315 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:14.315 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:14.315 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:14.573 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:14.573 "name": "BaseBdev2", 00:17:14.573 "aliases": [ 00:17:14.573 "bc0e71f5-6876-485b-bd03-adc3faa2bdb5" 00:17:14.573 ], 00:17:14.573 "product_name": "Malloc disk", 00:17:14.573 "block_size": 512, 00:17:14.573 "num_blocks": 65536, 00:17:14.573 "uuid": "bc0e71f5-6876-485b-bd03-adc3faa2bdb5", 00:17:14.573 "assigned_rate_limits": { 00:17:14.573 "rw_ios_per_sec": 0, 00:17:14.573 "rw_mbytes_per_sec": 0, 00:17:14.573 "r_mbytes_per_sec": 0, 00:17:14.573 "w_mbytes_per_sec": 0 00:17:14.573 }, 00:17:14.573 "claimed": true, 00:17:14.573 "claim_type": "exclusive_write", 00:17:14.573 "zoned": false, 00:17:14.573 "supported_io_types": { 00:17:14.573 "read": true, 00:17:14.573 "write": true, 00:17:14.573 "unmap": true, 00:17:14.573 "write_zeroes": true, 00:17:14.573 "flush": true, 00:17:14.573 "reset": true, 00:17:14.573 "compare": false, 00:17:14.573 "compare_and_write": false, 00:17:14.573 "abort": true, 00:17:14.573 "nvme_admin": false, 00:17:14.573 "nvme_io": false 00:17:14.573 }, 00:17:14.574 "memory_domains": [ 00:17:14.574 { 00:17:14.574 "dma_device_id": "system", 00:17:14.574 "dma_device_type": 1 00:17:14.574 }, 00:17:14.574 { 00:17:14.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.574 "dma_device_type": 2 00:17:14.574 } 00:17:14.574 ], 00:17:14.574 "driver_specific": {} 00:17:14.574 }' 00:17:14.574 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:14.574 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:14.574 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:14.574 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:14.574 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:14.574 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:14.574 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:14.574 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:14.832 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:14.832 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:14.832 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:14.832 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == 
null ]] 00:17:14.832 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:14.832 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:14.832 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:15.089 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:15.089 "name": "BaseBdev3", 00:17:15.089 "aliases": [ 00:17:15.089 "3cd7d043-fcae-4c06-a47d-05586fb8c921" 00:17:15.089 ], 00:17:15.089 "product_name": "Malloc disk", 00:17:15.089 "block_size": 512, 00:17:15.089 "num_blocks": 65536, 00:17:15.089 "uuid": "3cd7d043-fcae-4c06-a47d-05586fb8c921", 00:17:15.089 "assigned_rate_limits": { 00:17:15.089 "rw_ios_per_sec": 0, 00:17:15.089 "rw_mbytes_per_sec": 0, 00:17:15.089 "r_mbytes_per_sec": 0, 00:17:15.090 "w_mbytes_per_sec": 0 00:17:15.090 }, 00:17:15.090 "claimed": true, 00:17:15.090 "claim_type": "exclusive_write", 00:17:15.090 "zoned": false, 00:17:15.090 "supported_io_types": { 00:17:15.090 "read": true, 00:17:15.090 "write": true, 00:17:15.090 "unmap": true, 00:17:15.090 "write_zeroes": true, 00:17:15.090 "flush": true, 00:17:15.090 "reset": true, 00:17:15.090 "compare": false, 00:17:15.090 "compare_and_write": false, 00:17:15.090 "abort": true, 00:17:15.090 "nvme_admin": false, 00:17:15.090 "nvme_io": false 00:17:15.090 }, 00:17:15.090 "memory_domains": [ 00:17:15.090 { 00:17:15.090 "dma_device_id": "system", 00:17:15.090 "dma_device_type": 1 00:17:15.090 }, 00:17:15.090 { 00:17:15.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.090 "dma_device_type": 2 00:17:15.090 } 00:17:15.090 ], 00:17:15.090 "driver_specific": {} 00:17:15.090 }' 00:17:15.090 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:15.090 04:18:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:15.090 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:15.090 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:15.090 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:15.090 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:15.347 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:15.347 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:15.347 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:15.347 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:15.347 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:15.347 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:15.347 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:15.347 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:15.347 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 
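The block above drives the per-bdev property checks entirely through rpc.py and jq. The loop below is a minimal standalone sketch of that probe (not the test script itself); the socket, rpc.py path, bdev names, fields and expected values are taken from the log above, and only the shell variable names (sock, rpc, info, name, field) are added for readability:

    # sketch only: re-run the property checks seen above for each malloc base bdev
    sock=/var/tmp/spdk-raid.sock
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    for name in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
        info=$($rpc -s $sock bdev_get_bdevs -b $name | jq '.[]')
        # the malloc base bdevs were created with a 512-byte block size
        [[ $(echo "$info" | jq .block_size) == 512 ]] || echo "$name: unexpected block_size"
        # no metadata or DIF is configured, so these all evaluate to null
        for field in .md_size .md_interleave .dif_type; do
            [[ $(echo "$info" | jq "$field") == null ]] || echo "$name: unexpected $field"
        done
    done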
00:17:15.604 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:15.604 "name": "BaseBdev4", 00:17:15.604 "aliases": [ 00:17:15.604 "d478592c-c1ed-445f-ab23-217d8ce17d5e" 00:17:15.604 ], 00:17:15.604 "product_name": "Malloc disk", 00:17:15.604 "block_size": 512, 00:17:15.604 "num_blocks": 65536, 00:17:15.604 "uuid": "d478592c-c1ed-445f-ab23-217d8ce17d5e", 00:17:15.604 "assigned_rate_limits": { 00:17:15.604 "rw_ios_per_sec": 0, 00:17:15.604 "rw_mbytes_per_sec": 0, 00:17:15.604 "r_mbytes_per_sec": 0, 00:17:15.604 "w_mbytes_per_sec": 0 00:17:15.605 }, 00:17:15.605 "claimed": true, 00:17:15.605 "claim_type": "exclusive_write", 00:17:15.605 "zoned": false, 00:17:15.605 "supported_io_types": { 00:17:15.605 "read": true, 00:17:15.605 "write": true, 00:17:15.605 "unmap": true, 00:17:15.605 "write_zeroes": true, 00:17:15.605 "flush": true, 00:17:15.605 "reset": true, 00:17:15.605 "compare": false, 00:17:15.605 "compare_and_write": false, 00:17:15.605 "abort": true, 00:17:15.605 "nvme_admin": false, 00:17:15.605 "nvme_io": false 00:17:15.605 }, 00:17:15.605 "memory_domains": [ 00:17:15.605 { 00:17:15.605 "dma_device_id": "system", 00:17:15.605 "dma_device_type": 1 00:17:15.605 }, 00:17:15.605 { 00:17:15.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.605 "dma_device_type": 2 00:17:15.605 } 00:17:15.605 ], 00:17:15.605 "driver_specific": {} 00:17:15.605 }' 00:17:15.605 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:15.605 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:15.605 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:15.605 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:15.605 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:15.862 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:15.862 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:15.862 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:15.862 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:15.862 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:15.862 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:15.862 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:15.862 04:18:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:16.119 [2024-05-15 04:18:04.026522] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # local expected_state 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # case $1 in 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 0 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@280 -- # expected_state=online 
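bdev_malloc_delete BaseBdev1 above removes one member, and because raid1 carries redundancy the expected state stays online. The fragment below only approximates the follow-up check that verify_raid_bdev_state performs (it is not the helper's implementation); it reads the same bdev_raid_get_bdevs fields that appear in this log, with state and discovered as ad-hoc variable names:

    # approximate state check after dropping one of four raid1 members
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    info=$($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[] | select(.name == "Existed_Raid")')
    state=$(echo "$info" | jq -r .state)
    discovered=$(echo "$info" | jq -r .num_base_bdevs_discovered)
    # raid1 keeps redundancy, so three remaining base bdevs still leave the array online
    [[ $state == online && $discovered == 3 ]] || echo "unexpected state: $state ($discovered base bdevs)"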
00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.119 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:16.377 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:16.377 "name": "Existed_Raid", 00:17:16.377 "uuid": "d09d7d3f-263c-4d1e-96ba-58a67cb3188a", 00:17:16.377 "strip_size_kb": 0, 00:17:16.377 "state": "online", 00:17:16.377 "raid_level": "raid1", 00:17:16.377 "superblock": true, 00:17:16.377 "num_base_bdevs": 4, 00:17:16.377 "num_base_bdevs_discovered": 3, 00:17:16.377 "num_base_bdevs_operational": 3, 00:17:16.377 "base_bdevs_list": [ 00:17:16.377 { 00:17:16.377 "name": null, 00:17:16.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:16.377 "is_configured": false, 00:17:16.377 "data_offset": 2048, 00:17:16.377 "data_size": 63488 00:17:16.377 }, 00:17:16.377 { 00:17:16.377 "name": "BaseBdev2", 00:17:16.377 "uuid": "bc0e71f5-6876-485b-bd03-adc3faa2bdb5", 00:17:16.377 "is_configured": true, 00:17:16.378 "data_offset": 2048, 00:17:16.378 "data_size": 63488 00:17:16.378 }, 00:17:16.378 { 00:17:16.378 "name": "BaseBdev3", 00:17:16.378 "uuid": "3cd7d043-fcae-4c06-a47d-05586fb8c921", 00:17:16.378 "is_configured": true, 00:17:16.378 "data_offset": 2048, 00:17:16.378 "data_size": 63488 00:17:16.378 }, 00:17:16.378 { 00:17:16.378 "name": "BaseBdev4", 00:17:16.378 "uuid": "d478592c-c1ed-445f-ab23-217d8ce17d5e", 00:17:16.378 "is_configured": true, 00:17:16.378 "data_offset": 2048, 00:17:16.378 "data_size": 63488 00:17:16.378 } 00:17:16.378 ] 00:17:16.378 }' 00:17:16.378 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:16.378 04:18:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:16.943 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:17:16.943 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:16.943 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.943 04:18:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:17.200 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:17.200 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:17.200 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:17.457 [2024-05-15 04:18:05.368952] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:17.457 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:17.457 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:17.457 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.457 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:17.714 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:17.714 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:17.714 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:17.970 [2024-05-15 04:18:05.912626] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:17.970 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:17:17.970 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:17.970 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.970 04:18:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:17:18.226 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:17:18.226 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:18.226 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:18.482 [2024-05-15 04:18:06.405250] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:18.482 [2024-05-15 04:18:06.405365] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:18.482 [2024-05-15 04:18:06.420134] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:18.482 [2024-05-15 04:18:06.420207] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:18.482 [2024-05-15 04:18:06.420250] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f1f7f0 name Existed_Raid, state offline 00:17:18.482 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i++ )) 
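The loop above removed BaseBdev2, BaseBdev3 and BaseBdev4 in turn, re-reading the raid bdev name after each step until the array was torn down. A condensed sketch of that sequence, assuming the same rpc.py path and socket as in the log, is:

    # condensed teardown: delete the remaining base bdevs one at a time
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    for name in BaseBdev2 BaseBdev3 BaseBdev4; do
        $rpc -s $sock bdev_malloc_delete $name
        # prints Existed_Raid while the raid bdev survives, nothing once it is gone
        $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[0]["name"] | select(.)'
    done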
00:17:18.482 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:17:18.482 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.482 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:17:18.739 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:17:18.739 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:17:18.739 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@300 -- # '[' 4 -gt 2 ']' 00:17:18.739 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i = 1 )) 00:17:18.739 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:18.739 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:18.997 BaseBdev2 00:17:18.997 04:18:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev2 00:17:18.997 04:18:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:17:18.997 04:18:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:18.997 04:18:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:18.997 04:18:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:18.997 04:18:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:18.997 04:18:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:19.276 04:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:19.534 [ 00:17:19.534 { 00:17:19.534 "name": "BaseBdev2", 00:17:19.534 "aliases": [ 00:17:19.534 "0e91a598-2bd8-4a65-b827-481d06a7cefd" 00:17:19.534 ], 00:17:19.534 "product_name": "Malloc disk", 00:17:19.534 "block_size": 512, 00:17:19.534 "num_blocks": 65536, 00:17:19.534 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:19.534 "assigned_rate_limits": { 00:17:19.534 "rw_ios_per_sec": 0, 00:17:19.534 "rw_mbytes_per_sec": 0, 00:17:19.534 "r_mbytes_per_sec": 0, 00:17:19.534 "w_mbytes_per_sec": 0 00:17:19.534 }, 00:17:19.534 "claimed": false, 00:17:19.534 "zoned": false, 00:17:19.534 "supported_io_types": { 00:17:19.534 "read": true, 00:17:19.534 "write": true, 00:17:19.534 "unmap": true, 00:17:19.534 "write_zeroes": true, 00:17:19.534 "flush": true, 00:17:19.534 "reset": true, 00:17:19.534 "compare": false, 00:17:19.534 "compare_and_write": false, 00:17:19.534 "abort": true, 00:17:19.534 "nvme_admin": false, 00:17:19.534 "nvme_io": false 00:17:19.534 }, 00:17:19.534 "memory_domains": [ 00:17:19.534 { 00:17:19.534 "dma_device_id": "system", 00:17:19.534 "dma_device_type": 1 00:17:19.534 }, 00:17:19.534 { 00:17:19.534 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:19.534 "dma_device_type": 2 
00:17:19.534 } 00:17:19.534 ], 00:17:19.534 "driver_specific": {} 00:17:19.534 } 00:17:19.534 ] 00:17:19.534 04:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:19.534 04:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:19.534 04:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:19.534 04:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:19.792 BaseBdev3 00:17:19.792 04:18:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev3 00:17:19.792 04:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev3 00:17:19.792 04:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:19.792 04:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:19.792 04:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:19.792 04:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:19.792 04:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:20.050 04:18:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:20.307 [ 00:17:20.307 { 00:17:20.307 "name": "BaseBdev3", 00:17:20.307 "aliases": [ 00:17:20.307 "5501d796-633d-46ea-acb0-938e60e4a595" 00:17:20.307 ], 00:17:20.307 "product_name": "Malloc disk", 00:17:20.307 "block_size": 512, 00:17:20.307 "num_blocks": 65536, 00:17:20.307 "uuid": "5501d796-633d-46ea-acb0-938e60e4a595", 00:17:20.307 "assigned_rate_limits": { 00:17:20.307 "rw_ios_per_sec": 0, 00:17:20.307 "rw_mbytes_per_sec": 0, 00:17:20.307 "r_mbytes_per_sec": 0, 00:17:20.307 "w_mbytes_per_sec": 0 00:17:20.307 }, 00:17:20.307 "claimed": false, 00:17:20.307 "zoned": false, 00:17:20.307 "supported_io_types": { 00:17:20.307 "read": true, 00:17:20.307 "write": true, 00:17:20.307 "unmap": true, 00:17:20.307 "write_zeroes": true, 00:17:20.307 "flush": true, 00:17:20.307 "reset": true, 00:17:20.307 "compare": false, 00:17:20.307 "compare_and_write": false, 00:17:20.307 "abort": true, 00:17:20.307 "nvme_admin": false, 00:17:20.307 "nvme_io": false 00:17:20.307 }, 00:17:20.307 "memory_domains": [ 00:17:20.307 { 00:17:20.307 "dma_device_id": "system", 00:17:20.307 "dma_device_type": 1 00:17:20.307 }, 00:17:20.307 { 00:17:20.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.307 "dma_device_type": 2 00:17:20.307 } 00:17:20.307 ], 00:17:20.307 "driver_specific": {} 00:17:20.307 } 00:17:20.307 ] 00:17:20.307 04:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:20.307 04:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:20.307 04:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:20.308 04:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:20.565 BaseBdev4 00:17:20.565 04:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@304 -- # waitforbdev BaseBdev4 00:17:20.565 04:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev4 00:17:20.565 04:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:20.565 04:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:20.565 04:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:20.565 04:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:20.565 04:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:20.822 04:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:21.080 [ 00:17:21.080 { 00:17:21.080 "name": "BaseBdev4", 00:17:21.080 "aliases": [ 00:17:21.080 "a5d453e1-22a9-41f5-9ea6-46dbed69cec5" 00:17:21.080 ], 00:17:21.080 "product_name": "Malloc disk", 00:17:21.080 "block_size": 512, 00:17:21.080 "num_blocks": 65536, 00:17:21.080 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:21.080 "assigned_rate_limits": { 00:17:21.080 "rw_ios_per_sec": 0, 00:17:21.080 "rw_mbytes_per_sec": 0, 00:17:21.080 "r_mbytes_per_sec": 0, 00:17:21.080 "w_mbytes_per_sec": 0 00:17:21.080 }, 00:17:21.080 "claimed": false, 00:17:21.080 "zoned": false, 00:17:21.080 "supported_io_types": { 00:17:21.080 "read": true, 00:17:21.080 "write": true, 00:17:21.080 "unmap": true, 00:17:21.080 "write_zeroes": true, 00:17:21.080 "flush": true, 00:17:21.080 "reset": true, 00:17:21.080 "compare": false, 00:17:21.080 "compare_and_write": false, 00:17:21.080 "abort": true, 00:17:21.080 "nvme_admin": false, 00:17:21.080 "nvme_io": false 00:17:21.080 }, 00:17:21.080 "memory_domains": [ 00:17:21.080 { 00:17:21.080 "dma_device_id": "system", 00:17:21.080 "dma_device_type": 1 00:17:21.080 }, 00:17:21.080 { 00:17:21.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.080 "dma_device_type": 2 00:17:21.080 } 00:17:21.080 ], 00:17:21.080 "driver_specific": {} 00:17:21.080 } 00:17:21.080 ] 00:17:21.080 04:18:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:21.080 04:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i++ )) 00:17:21.080 04:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # (( i < num_base_bdevs )) 00:17:21.080 04:18:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:21.339 [2024-05-15 04:18:09.152720] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:21.339 [2024-05-15 04:18:09.152761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:21.339 [2024-05-15 04:18:09.152786] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 
is claimed 00:17:21.339 [2024-05-15 04:18:09.154056] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:21.339 [2024-05-15 04:18:09.154116] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@307 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.339 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:21.597 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:21.597 "name": "Existed_Raid", 00:17:21.597 "uuid": "73bb55da-2e17-42c0-84d5-c385bd6b4bac", 00:17:21.597 "strip_size_kb": 0, 00:17:21.597 "state": "configuring", 00:17:21.597 "raid_level": "raid1", 00:17:21.597 "superblock": true, 00:17:21.597 "num_base_bdevs": 4, 00:17:21.597 "num_base_bdevs_discovered": 3, 00:17:21.597 "num_base_bdevs_operational": 4, 00:17:21.597 "base_bdevs_list": [ 00:17:21.597 { 00:17:21.597 "name": "BaseBdev1", 00:17:21.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.597 "is_configured": false, 00:17:21.597 "data_offset": 0, 00:17:21.597 "data_size": 0 00:17:21.597 }, 00:17:21.597 { 00:17:21.597 "name": "BaseBdev2", 00:17:21.597 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:21.597 "is_configured": true, 00:17:21.597 "data_offset": 2048, 00:17:21.597 "data_size": 63488 00:17:21.597 }, 00:17:21.597 { 00:17:21.597 "name": "BaseBdev3", 00:17:21.597 "uuid": "5501d796-633d-46ea-acb0-938e60e4a595", 00:17:21.597 "is_configured": true, 00:17:21.597 "data_offset": 2048, 00:17:21.597 "data_size": 63488 00:17:21.597 }, 00:17:21.597 { 00:17:21.597 "name": "BaseBdev4", 00:17:21.597 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:21.597 "is_configured": true, 00:17:21.597 "data_offset": 2048, 00:17:21.597 "data_size": 63488 00:17:21.597 } 00:17:21.597 ] 00:17:21.597 }' 00:17:21.597 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:21.597 04:18:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:22.163 04:18:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:22.421 [2024-05-15 04:18:10.203481] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.421 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.678 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:22.678 "name": "Existed_Raid", 00:17:22.678 "uuid": "73bb55da-2e17-42c0-84d5-c385bd6b4bac", 00:17:22.678 "strip_size_kb": 0, 00:17:22.678 "state": "configuring", 00:17:22.678 "raid_level": "raid1", 00:17:22.678 "superblock": true, 00:17:22.678 "num_base_bdevs": 4, 00:17:22.678 "num_base_bdevs_discovered": 2, 00:17:22.678 "num_base_bdevs_operational": 4, 00:17:22.678 "base_bdevs_list": [ 00:17:22.678 { 00:17:22.679 "name": "BaseBdev1", 00:17:22.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:22.679 "is_configured": false, 00:17:22.679 "data_offset": 0, 00:17:22.679 "data_size": 0 00:17:22.679 }, 00:17:22.679 { 00:17:22.679 "name": null, 00:17:22.679 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:22.679 "is_configured": false, 00:17:22.679 "data_offset": 2048, 00:17:22.679 "data_size": 63488 00:17:22.679 }, 00:17:22.679 { 00:17:22.679 "name": "BaseBdev3", 00:17:22.679 "uuid": "5501d796-633d-46ea-acb0-938e60e4a595", 00:17:22.679 "is_configured": true, 00:17:22.679 "data_offset": 2048, 00:17:22.679 "data_size": 63488 00:17:22.679 }, 00:17:22.679 { 00:17:22.679 "name": "BaseBdev4", 00:17:22.679 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:22.679 "is_configured": true, 00:17:22.679 "data_offset": 2048, 00:17:22.679 "data_size": 63488 00:17:22.679 } 00:17:22.679 ] 00:17:22.679 }' 00:17:22.679 04:18:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:22.679 04:18:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:23.244 04:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.244 04:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:23.501 04:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@311 -- # [[ false == \f\a\l\s\e ]] 00:17:23.501 04:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:23.501 [2024-05-15 04:18:11.492668] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:23.501 BaseBdev1 00:17:23.501 04:18:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # waitforbdev BaseBdev1 00:17:23.501 04:18:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:17:23.501 04:18:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:23.501 04:18:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:23.501 04:18:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:23.501 04:18:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:23.501 04:18:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:23.759 04:18:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:24.016 [ 00:17:24.016 { 00:17:24.016 "name": "BaseBdev1", 00:17:24.016 "aliases": [ 00:17:24.017 "eac2edaa-e7e5-4f01-ba9a-b13db428f20b" 00:17:24.017 ], 00:17:24.017 "product_name": "Malloc disk", 00:17:24.017 "block_size": 512, 00:17:24.017 "num_blocks": 65536, 00:17:24.017 "uuid": "eac2edaa-e7e5-4f01-ba9a-b13db428f20b", 00:17:24.017 "assigned_rate_limits": { 00:17:24.017 "rw_ios_per_sec": 0, 00:17:24.017 "rw_mbytes_per_sec": 0, 00:17:24.017 "r_mbytes_per_sec": 0, 00:17:24.017 "w_mbytes_per_sec": 0 00:17:24.017 }, 00:17:24.017 "claimed": true, 00:17:24.017 "claim_type": "exclusive_write", 00:17:24.017 "zoned": false, 00:17:24.017 "supported_io_types": { 00:17:24.017 "read": true, 00:17:24.017 "write": true, 00:17:24.017 "unmap": true, 00:17:24.017 "write_zeroes": true, 00:17:24.017 "flush": true, 00:17:24.017 "reset": true, 00:17:24.017 "compare": false, 00:17:24.017 "compare_and_write": false, 00:17:24.017 "abort": true, 00:17:24.017 "nvme_admin": false, 00:17:24.017 "nvme_io": false 00:17:24.017 }, 00:17:24.017 "memory_domains": [ 00:17:24.017 { 00:17:24.017 "dma_device_id": "system", 00:17:24.017 "dma_device_type": 1 00:17:24.017 }, 00:17:24.017 { 00:17:24.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.017 "dma_device_type": 2 00:17:24.017 } 00:17:24.017 ], 00:17:24.017 "driver_specific": {} 00:17:24.017 } 00:17:24.017 ] 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # 
local raid_bdev_name=Existed_Raid 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.017 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:24.275 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:24.275 "name": "Existed_Raid", 00:17:24.275 "uuid": "73bb55da-2e17-42c0-84d5-c385bd6b4bac", 00:17:24.275 "strip_size_kb": 0, 00:17:24.275 "state": "configuring", 00:17:24.275 "raid_level": "raid1", 00:17:24.275 "superblock": true, 00:17:24.275 "num_base_bdevs": 4, 00:17:24.275 "num_base_bdevs_discovered": 3, 00:17:24.275 "num_base_bdevs_operational": 4, 00:17:24.275 "base_bdevs_list": [ 00:17:24.275 { 00:17:24.275 "name": "BaseBdev1", 00:17:24.275 "uuid": "eac2edaa-e7e5-4f01-ba9a-b13db428f20b", 00:17:24.275 "is_configured": true, 00:17:24.275 "data_offset": 2048, 00:17:24.275 "data_size": 63488 00:17:24.275 }, 00:17:24.275 { 00:17:24.275 "name": null, 00:17:24.275 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:24.275 "is_configured": false, 00:17:24.275 "data_offset": 2048, 00:17:24.275 "data_size": 63488 00:17:24.275 }, 00:17:24.275 { 00:17:24.275 "name": "BaseBdev3", 00:17:24.275 "uuid": "5501d796-633d-46ea-acb0-938e60e4a595", 00:17:24.275 "is_configured": true, 00:17:24.275 "data_offset": 2048, 00:17:24.275 "data_size": 63488 00:17:24.275 }, 00:17:24.275 { 00:17:24.275 "name": "BaseBdev4", 00:17:24.275 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:24.275 "is_configured": true, 00:17:24.275 "data_offset": 2048, 00:17:24.275 "data_size": 63488 00:17:24.275 } 00:17:24.275 ] 00:17:24.275 }' 00:17:24.275 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:24.275 04:18:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:24.840 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.840 04:18:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:25.097 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@316 -- # [[ true == \t\r\u\e ]] 00:17:25.097 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:25.356 [2024-05-15 04:18:13.321529] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:25.356 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:25.613 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:25.613 "name": "Existed_Raid", 00:17:25.613 "uuid": "73bb55da-2e17-42c0-84d5-c385bd6b4bac", 00:17:25.613 "strip_size_kb": 0, 00:17:25.613 "state": "configuring", 00:17:25.613 "raid_level": "raid1", 00:17:25.613 "superblock": true, 00:17:25.613 "num_base_bdevs": 4, 00:17:25.613 "num_base_bdevs_discovered": 2, 00:17:25.613 "num_base_bdevs_operational": 4, 00:17:25.613 "base_bdevs_list": [ 00:17:25.613 { 00:17:25.613 "name": "BaseBdev1", 00:17:25.613 "uuid": "eac2edaa-e7e5-4f01-ba9a-b13db428f20b", 00:17:25.613 "is_configured": true, 00:17:25.613 "data_offset": 2048, 00:17:25.613 "data_size": 63488 00:17:25.613 }, 00:17:25.613 { 00:17:25.613 "name": null, 00:17:25.613 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:25.613 "is_configured": false, 00:17:25.613 "data_offset": 2048, 00:17:25.613 "data_size": 63488 00:17:25.613 }, 00:17:25.613 { 00:17:25.613 "name": null, 00:17:25.613 "uuid": "5501d796-633d-46ea-acb0-938e60e4a595", 00:17:25.613 "is_configured": false, 00:17:25.613 "data_offset": 2048, 00:17:25.613 "data_size": 63488 00:17:25.613 }, 00:17:25.613 { 00:17:25.613 "name": "BaseBdev4", 00:17:25.613 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:25.613 "is_configured": true, 00:17:25.613 "data_offset": 2048, 00:17:25.613 "data_size": 63488 00:17:25.613 } 00:17:25.613 ] 00:17:25.613 }' 00:17:25.613 04:18:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:25.613 04:18:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:26.178 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.178 
04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:26.436 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@320 -- # [[ false == \f\a\l\s\e ]] 00:17:26.436 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:26.694 [2024-05-15 04:18:14.580891] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.694 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.952 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:26.952 "name": "Existed_Raid", 00:17:26.952 "uuid": "73bb55da-2e17-42c0-84d5-c385bd6b4bac", 00:17:26.952 "strip_size_kb": 0, 00:17:26.952 "state": "configuring", 00:17:26.952 "raid_level": "raid1", 00:17:26.952 "superblock": true, 00:17:26.952 "num_base_bdevs": 4, 00:17:26.952 "num_base_bdevs_discovered": 3, 00:17:26.952 "num_base_bdevs_operational": 4, 00:17:26.952 "base_bdevs_list": [ 00:17:26.952 { 00:17:26.952 "name": "BaseBdev1", 00:17:26.952 "uuid": "eac2edaa-e7e5-4f01-ba9a-b13db428f20b", 00:17:26.952 "is_configured": true, 00:17:26.952 "data_offset": 2048, 00:17:26.952 "data_size": 63488 00:17:26.952 }, 00:17:26.952 { 00:17:26.952 "name": null, 00:17:26.952 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:26.952 "is_configured": false, 00:17:26.952 "data_offset": 2048, 00:17:26.952 "data_size": 63488 00:17:26.952 }, 00:17:26.952 { 00:17:26.952 "name": "BaseBdev3", 00:17:26.952 "uuid": "5501d796-633d-46ea-acb0-938e60e4a595", 00:17:26.952 "is_configured": true, 00:17:26.952 "data_offset": 2048, 00:17:26.952 "data_size": 63488 00:17:26.952 }, 00:17:26.952 { 00:17:26.952 "name": "BaseBdev4", 00:17:26.952 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:26.952 "is_configured": true, 00:17:26.952 "data_offset": 2048, 00:17:26.952 "data_size": 63488 00:17:26.952 } 00:17:26.952 ] 00:17:26.952 }' 
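The check that keeps repeating in the trace above is one RPC query plus jq filtering, wrapped by verify_raid_bdev_state. Stripped of the xtrace noise, the pattern being exercised looks roughly like the following hand-run sketch; it assumes the bdev_svc app from this run is still listening on /var/tmp/spdk-raid.sock and that the commands are issued from the spdk checkout root.

# Fetch every raid bdev the app knows about and keep the one under test.
info=$(scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "Existed_Raid")')

# The assertions boil down to these fields: state, level and the two counters.
jq -r '.state, .raid_level, .num_base_bdevs_discovered, .num_base_bdevs_operational' <<< "$info"

# While any slot still reads is_configured=false the array reports "configuring"
# with fewer base bdevs discovered than operational; with all four claimed it
# transitions to "online", which is what the test asserts at each step.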
00:17:26.952 04:18:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:26.952 04:18:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:27.517 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.517 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:27.774 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@324 -- # [[ true == \t\r\u\e ]] 00:17:27.774 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:28.032 [2024-05-15 04:18:15.828202] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.032 04:18:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.291 04:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:28.291 "name": "Existed_Raid", 00:17:28.291 "uuid": "73bb55da-2e17-42c0-84d5-c385bd6b4bac", 00:17:28.291 "strip_size_kb": 0, 00:17:28.291 "state": "configuring", 00:17:28.291 "raid_level": "raid1", 00:17:28.291 "superblock": true, 00:17:28.291 "num_base_bdevs": 4, 00:17:28.291 "num_base_bdevs_discovered": 2, 00:17:28.291 "num_base_bdevs_operational": 4, 00:17:28.291 "base_bdevs_list": [ 00:17:28.291 { 00:17:28.291 "name": null, 00:17:28.291 "uuid": "eac2edaa-e7e5-4f01-ba9a-b13db428f20b", 00:17:28.291 "is_configured": false, 00:17:28.291 "data_offset": 2048, 00:17:28.291 "data_size": 63488 00:17:28.291 }, 00:17:28.291 { 00:17:28.291 "name": null, 00:17:28.291 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:28.291 "is_configured": false, 00:17:28.291 "data_offset": 2048, 00:17:28.291 "data_size": 63488 00:17:28.291 }, 00:17:28.291 { 00:17:28.291 "name": "BaseBdev3", 00:17:28.291 "uuid": 
"5501d796-633d-46ea-acb0-938e60e4a595", 00:17:28.291 "is_configured": true, 00:17:28.291 "data_offset": 2048, 00:17:28.291 "data_size": 63488 00:17:28.291 }, 00:17:28.291 { 00:17:28.291 "name": "BaseBdev4", 00:17:28.291 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:28.291 "is_configured": true, 00:17:28.291 "data_offset": 2048, 00:17:28.291 "data_size": 63488 00:17:28.291 } 00:17:28.291 ] 00:17:28.291 }' 00:17:28.291 04:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:28.291 04:18:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:28.856 04:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.856 04:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:29.114 04:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@328 -- # [[ false == \f\a\l\s\e ]] 00:17:29.114 04:18:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:29.372 [2024-05-15 04:18:17.180714] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.372 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:29.630 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:29.630 "name": "Existed_Raid", 00:17:29.630 "uuid": "73bb55da-2e17-42c0-84d5-c385bd6b4bac", 00:17:29.630 "strip_size_kb": 0, 00:17:29.630 "state": "configuring", 00:17:29.630 "raid_level": "raid1", 00:17:29.630 "superblock": true, 00:17:29.630 "num_base_bdevs": 4, 00:17:29.630 "num_base_bdevs_discovered": 3, 00:17:29.630 "num_base_bdevs_operational": 4, 00:17:29.630 "base_bdevs_list": [ 00:17:29.630 { 00:17:29.630 "name": null, 00:17:29.630 "uuid": 
"eac2edaa-e7e5-4f01-ba9a-b13db428f20b", 00:17:29.630 "is_configured": false, 00:17:29.630 "data_offset": 2048, 00:17:29.630 "data_size": 63488 00:17:29.630 }, 00:17:29.630 { 00:17:29.630 "name": "BaseBdev2", 00:17:29.630 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:29.630 "is_configured": true, 00:17:29.630 "data_offset": 2048, 00:17:29.630 "data_size": 63488 00:17:29.630 }, 00:17:29.630 { 00:17:29.630 "name": "BaseBdev3", 00:17:29.630 "uuid": "5501d796-633d-46ea-acb0-938e60e4a595", 00:17:29.630 "is_configured": true, 00:17:29.630 "data_offset": 2048, 00:17:29.630 "data_size": 63488 00:17:29.630 }, 00:17:29.630 { 00:17:29.630 "name": "BaseBdev4", 00:17:29.630 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:29.630 "is_configured": true, 00:17:29.630 "data_offset": 2048, 00:17:29.630 "data_size": 63488 00:17:29.630 } 00:17:29.630 ] 00:17:29.630 }' 00:17:29.630 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:29.630 04:18:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:30.196 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.196 04:18:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:30.196 04:18:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@332 -- # [[ true == \t\r\u\e ]] 00:17:30.196 04:18:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:30.196 04:18:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:30.762 04:18:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u eac2edaa-e7e5-4f01-ba9a-b13db428f20b 00:17:30.762 [2024-05-15 04:18:18.774290] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:30.762 [2024-05-15 04:18:18.774531] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f1f4f0 00:17:30.762 [2024-05-15 04:18:18.774548] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:30.762 [2024-05-15 04:18:18.774708] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20cf440 00:17:30.762 [2024-05-15 04:18:18.774905] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f1f4f0 00:17:30.762 [2024-05-15 04:18:18.774921] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f1f4f0 00:17:30.762 [2024-05-15 04:18:18.775020] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:30.762 NewBaseBdev 00:17:31.020 04:18:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # waitforbdev NewBaseBdev 00:17:31.020 04:18:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@895 -- # local bdev_name=NewBaseBdev 00:17:31.020 04:18:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:17:31.020 04:18:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local i 00:17:31.020 04:18:18 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:17:31.020 04:18:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:17:31.020 04:18:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:31.278 04:18:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:31.536 [ 00:17:31.536 { 00:17:31.536 "name": "NewBaseBdev", 00:17:31.536 "aliases": [ 00:17:31.536 "eac2edaa-e7e5-4f01-ba9a-b13db428f20b" 00:17:31.536 ], 00:17:31.536 "product_name": "Malloc disk", 00:17:31.536 "block_size": 512, 00:17:31.536 "num_blocks": 65536, 00:17:31.536 "uuid": "eac2edaa-e7e5-4f01-ba9a-b13db428f20b", 00:17:31.536 "assigned_rate_limits": { 00:17:31.536 "rw_ios_per_sec": 0, 00:17:31.536 "rw_mbytes_per_sec": 0, 00:17:31.536 "r_mbytes_per_sec": 0, 00:17:31.536 "w_mbytes_per_sec": 0 00:17:31.536 }, 00:17:31.536 "claimed": true, 00:17:31.536 "claim_type": "exclusive_write", 00:17:31.536 "zoned": false, 00:17:31.536 "supported_io_types": { 00:17:31.536 "read": true, 00:17:31.536 "write": true, 00:17:31.536 "unmap": true, 00:17:31.536 "write_zeroes": true, 00:17:31.536 "flush": true, 00:17:31.536 "reset": true, 00:17:31.536 "compare": false, 00:17:31.536 "compare_and_write": false, 00:17:31.536 "abort": true, 00:17:31.536 "nvme_admin": false, 00:17:31.536 "nvme_io": false 00:17:31.536 }, 00:17:31.536 "memory_domains": [ 00:17:31.536 { 00:17:31.536 "dma_device_id": "system", 00:17:31.536 "dma_device_type": 1 00:17:31.536 }, 00:17:31.536 { 00:17:31.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.536 "dma_device_type": 2 00:17:31.536 } 00:17:31.536 ], 00:17:31.536 "driver_specific": {} 00:17:31.536 } 00:17:31.536 ] 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # return 0 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.536 04:18:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:31.794 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:31.794 "name": "Existed_Raid", 00:17:31.794 "uuid": "73bb55da-2e17-42c0-84d5-c385bd6b4bac", 00:17:31.794 "strip_size_kb": 0, 00:17:31.794 "state": "online", 00:17:31.794 "raid_level": "raid1", 00:17:31.794 "superblock": true, 00:17:31.794 "num_base_bdevs": 4, 00:17:31.794 "num_base_bdevs_discovered": 4, 00:17:31.794 "num_base_bdevs_operational": 4, 00:17:31.794 "base_bdevs_list": [ 00:17:31.794 { 00:17:31.794 "name": "NewBaseBdev", 00:17:31.794 "uuid": "eac2edaa-e7e5-4f01-ba9a-b13db428f20b", 00:17:31.794 "is_configured": true, 00:17:31.794 "data_offset": 2048, 00:17:31.794 "data_size": 63488 00:17:31.794 }, 00:17:31.794 { 00:17:31.794 "name": "BaseBdev2", 00:17:31.794 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:31.794 "is_configured": true, 00:17:31.794 "data_offset": 2048, 00:17:31.794 "data_size": 63488 00:17:31.794 }, 00:17:31.794 { 00:17:31.794 "name": "BaseBdev3", 00:17:31.794 "uuid": "5501d796-633d-46ea-acb0-938e60e4a595", 00:17:31.794 "is_configured": true, 00:17:31.794 "data_offset": 2048, 00:17:31.794 "data_size": 63488 00:17:31.794 }, 00:17:31.794 { 00:17:31.794 "name": "BaseBdev4", 00:17:31.794 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:31.794 "is_configured": true, 00:17:31.794 "data_offset": 2048, 00:17:31.794 "data_size": 63488 00:17:31.794 } 00:17:31.794 ] 00:17:31.794 }' 00:17:31.794 04:18:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:31.794 04:18:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:32.360 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@337 -- # verify_raid_bdev_properties Existed_Raid 00:17:32.360 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:17:32.360 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:17:32.360 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:32.360 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:32.360 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@199 -- # local name 00:17:32.360 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:32.360 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:32.360 [2024-05-15 04:18:20.362729] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:32.618 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:32.618 "name": "Existed_Raid", 00:17:32.618 "aliases": [ 00:17:32.618 "73bb55da-2e17-42c0-84d5-c385bd6b4bac" 00:17:32.618 ], 00:17:32.618 "product_name": "Raid Volume", 00:17:32.618 "block_size": 512, 00:17:32.618 "num_blocks": 63488, 00:17:32.618 "uuid": "73bb55da-2e17-42c0-84d5-c385bd6b4bac", 00:17:32.618 "assigned_rate_limits": { 00:17:32.618 "rw_ios_per_sec": 0, 00:17:32.618 "rw_mbytes_per_sec": 0, 00:17:32.618 "r_mbytes_per_sec": 0, 00:17:32.618 "w_mbytes_per_sec": 0 00:17:32.618 }, 00:17:32.618 "claimed": false, 00:17:32.618 "zoned": false, 00:17:32.618 "supported_io_types": { 
00:17:32.618 "read": true, 00:17:32.618 "write": true, 00:17:32.618 "unmap": false, 00:17:32.618 "write_zeroes": true, 00:17:32.618 "flush": false, 00:17:32.618 "reset": true, 00:17:32.618 "compare": false, 00:17:32.618 "compare_and_write": false, 00:17:32.618 "abort": false, 00:17:32.618 "nvme_admin": false, 00:17:32.618 "nvme_io": false 00:17:32.618 }, 00:17:32.618 "memory_domains": [ 00:17:32.618 { 00:17:32.618 "dma_device_id": "system", 00:17:32.618 "dma_device_type": 1 00:17:32.618 }, 00:17:32.618 { 00:17:32.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.618 "dma_device_type": 2 00:17:32.618 }, 00:17:32.618 { 00:17:32.618 "dma_device_id": "system", 00:17:32.618 "dma_device_type": 1 00:17:32.618 }, 00:17:32.618 { 00:17:32.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.618 "dma_device_type": 2 00:17:32.618 }, 00:17:32.618 { 00:17:32.618 "dma_device_id": "system", 00:17:32.618 "dma_device_type": 1 00:17:32.618 }, 00:17:32.618 { 00:17:32.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.618 "dma_device_type": 2 00:17:32.618 }, 00:17:32.618 { 00:17:32.618 "dma_device_id": "system", 00:17:32.618 "dma_device_type": 1 00:17:32.618 }, 00:17:32.618 { 00:17:32.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.618 "dma_device_type": 2 00:17:32.618 } 00:17:32.618 ], 00:17:32.618 "driver_specific": { 00:17:32.618 "raid": { 00:17:32.618 "uuid": "73bb55da-2e17-42c0-84d5-c385bd6b4bac", 00:17:32.618 "strip_size_kb": 0, 00:17:32.618 "state": "online", 00:17:32.618 "raid_level": "raid1", 00:17:32.618 "superblock": true, 00:17:32.618 "num_base_bdevs": 4, 00:17:32.618 "num_base_bdevs_discovered": 4, 00:17:32.618 "num_base_bdevs_operational": 4, 00:17:32.618 "base_bdevs_list": [ 00:17:32.618 { 00:17:32.618 "name": "NewBaseBdev", 00:17:32.618 "uuid": "eac2edaa-e7e5-4f01-ba9a-b13db428f20b", 00:17:32.618 "is_configured": true, 00:17:32.618 "data_offset": 2048, 00:17:32.618 "data_size": 63488 00:17:32.618 }, 00:17:32.618 { 00:17:32.618 "name": "BaseBdev2", 00:17:32.618 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:32.618 "is_configured": true, 00:17:32.618 "data_offset": 2048, 00:17:32.618 "data_size": 63488 00:17:32.618 }, 00:17:32.618 { 00:17:32.618 "name": "BaseBdev3", 00:17:32.618 "uuid": "5501d796-633d-46ea-acb0-938e60e4a595", 00:17:32.618 "is_configured": true, 00:17:32.618 "data_offset": 2048, 00:17:32.618 "data_size": 63488 00:17:32.618 }, 00:17:32.618 { 00:17:32.618 "name": "BaseBdev4", 00:17:32.618 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:32.618 "is_configured": true, 00:17:32.618 "data_offset": 2048, 00:17:32.618 "data_size": 63488 00:17:32.618 } 00:17:32.618 ] 00:17:32.618 } 00:17:32.618 } 00:17:32.618 }' 00:17:32.618 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:32.618 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@202 -- # base_bdev_names='NewBaseBdev 00:17:32.618 BaseBdev2 00:17:32.618 BaseBdev3 00:17:32.618 BaseBdev4' 00:17:32.618 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:32.618 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:32.618 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:32.876 04:18:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:32.876 "name": "NewBaseBdev", 00:17:32.876 "aliases": [ 00:17:32.876 "eac2edaa-e7e5-4f01-ba9a-b13db428f20b" 00:17:32.876 ], 00:17:32.876 "product_name": "Malloc disk", 00:17:32.876 "block_size": 512, 00:17:32.876 "num_blocks": 65536, 00:17:32.876 "uuid": "eac2edaa-e7e5-4f01-ba9a-b13db428f20b", 00:17:32.876 "assigned_rate_limits": { 00:17:32.876 "rw_ios_per_sec": 0, 00:17:32.876 "rw_mbytes_per_sec": 0, 00:17:32.876 "r_mbytes_per_sec": 0, 00:17:32.876 "w_mbytes_per_sec": 0 00:17:32.876 }, 00:17:32.876 "claimed": true, 00:17:32.876 "claim_type": "exclusive_write", 00:17:32.876 "zoned": false, 00:17:32.876 "supported_io_types": { 00:17:32.876 "read": true, 00:17:32.876 "write": true, 00:17:32.876 "unmap": true, 00:17:32.876 "write_zeroes": true, 00:17:32.876 "flush": true, 00:17:32.876 "reset": true, 00:17:32.876 "compare": false, 00:17:32.876 "compare_and_write": false, 00:17:32.876 "abort": true, 00:17:32.876 "nvme_admin": false, 00:17:32.876 "nvme_io": false 00:17:32.877 }, 00:17:32.877 "memory_domains": [ 00:17:32.877 { 00:17:32.877 "dma_device_id": "system", 00:17:32.877 "dma_device_type": 1 00:17:32.877 }, 00:17:32.877 { 00:17:32.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.877 "dma_device_type": 2 00:17:32.877 } 00:17:32.877 ], 00:17:32.877 "driver_specific": {} 00:17:32.877 }' 00:17:32.877 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:32.877 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:32.877 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:32.877 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:32.877 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:32.877 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.877 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:32.877 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:33.134 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:33.135 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:33.135 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:33.135 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:33.135 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:33.135 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:33.135 04:18:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:33.393 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:33.393 "name": "BaseBdev2", 00:17:33.393 "aliases": [ 00:17:33.393 "0e91a598-2bd8-4a65-b827-481d06a7cefd" 00:17:33.393 ], 00:17:33.393 "product_name": "Malloc disk", 00:17:33.393 "block_size": 512, 00:17:33.393 "num_blocks": 65536, 00:17:33.393 "uuid": "0e91a598-2bd8-4a65-b827-481d06a7cefd", 00:17:33.393 "assigned_rate_limits": { 00:17:33.393 "rw_ios_per_sec": 0, 00:17:33.393 
"rw_mbytes_per_sec": 0, 00:17:33.393 "r_mbytes_per_sec": 0, 00:17:33.393 "w_mbytes_per_sec": 0 00:17:33.393 }, 00:17:33.393 "claimed": true, 00:17:33.393 "claim_type": "exclusive_write", 00:17:33.393 "zoned": false, 00:17:33.393 "supported_io_types": { 00:17:33.393 "read": true, 00:17:33.393 "write": true, 00:17:33.393 "unmap": true, 00:17:33.393 "write_zeroes": true, 00:17:33.393 "flush": true, 00:17:33.393 "reset": true, 00:17:33.393 "compare": false, 00:17:33.393 "compare_and_write": false, 00:17:33.393 "abort": true, 00:17:33.393 "nvme_admin": false, 00:17:33.393 "nvme_io": false 00:17:33.393 }, 00:17:33.393 "memory_domains": [ 00:17:33.393 { 00:17:33.393 "dma_device_id": "system", 00:17:33.393 "dma_device_type": 1 00:17:33.393 }, 00:17:33.393 { 00:17:33.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.393 "dma_device_type": 2 00:17:33.393 } 00:17:33.393 ], 00:17:33.393 "driver_specific": {} 00:17:33.393 }' 00:17:33.393 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:33.393 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:33.393 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:33.393 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:33.393 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:33.393 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:33.393 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:33.393 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:33.652 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:33.652 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:33.652 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:33.652 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:33.652 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:33.652 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:33.652 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:33.910 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:33.910 "name": "BaseBdev3", 00:17:33.910 "aliases": [ 00:17:33.910 "5501d796-633d-46ea-acb0-938e60e4a595" 00:17:33.910 ], 00:17:33.910 "product_name": "Malloc disk", 00:17:33.910 "block_size": 512, 00:17:33.910 "num_blocks": 65536, 00:17:33.910 "uuid": "5501d796-633d-46ea-acb0-938e60e4a595", 00:17:33.910 "assigned_rate_limits": { 00:17:33.910 "rw_ios_per_sec": 0, 00:17:33.910 "rw_mbytes_per_sec": 0, 00:17:33.910 "r_mbytes_per_sec": 0, 00:17:33.910 "w_mbytes_per_sec": 0 00:17:33.910 }, 00:17:33.910 "claimed": true, 00:17:33.910 "claim_type": "exclusive_write", 00:17:33.910 "zoned": false, 00:17:33.910 "supported_io_types": { 00:17:33.910 "read": true, 00:17:33.910 "write": true, 00:17:33.910 "unmap": true, 00:17:33.910 "write_zeroes": true, 00:17:33.910 "flush": true, 00:17:33.910 "reset": true, 
00:17:33.910 "compare": false, 00:17:33.910 "compare_and_write": false, 00:17:33.910 "abort": true, 00:17:33.910 "nvme_admin": false, 00:17:33.910 "nvme_io": false 00:17:33.910 }, 00:17:33.910 "memory_domains": [ 00:17:33.910 { 00:17:33.910 "dma_device_id": "system", 00:17:33.910 "dma_device_type": 1 00:17:33.910 }, 00:17:33.910 { 00:17:33.910 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:33.910 "dma_device_type": 2 00:17:33.910 } 00:17:33.910 ], 00:17:33.910 "driver_specific": {} 00:17:33.910 }' 00:17:33.910 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:33.910 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:33.910 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:33.910 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:33.910 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:33.910 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:33.910 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:34.168 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:34.168 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:34.168 04:18:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:34.168 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:34.168 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:34.168 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:34.168 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:34.168 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:34.426 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:34.426 "name": "BaseBdev4", 00:17:34.426 "aliases": [ 00:17:34.426 "a5d453e1-22a9-41f5-9ea6-46dbed69cec5" 00:17:34.426 ], 00:17:34.426 "product_name": "Malloc disk", 00:17:34.426 "block_size": 512, 00:17:34.426 "num_blocks": 65536, 00:17:34.426 "uuid": "a5d453e1-22a9-41f5-9ea6-46dbed69cec5", 00:17:34.426 "assigned_rate_limits": { 00:17:34.426 "rw_ios_per_sec": 0, 00:17:34.426 "rw_mbytes_per_sec": 0, 00:17:34.426 "r_mbytes_per_sec": 0, 00:17:34.426 "w_mbytes_per_sec": 0 00:17:34.426 }, 00:17:34.426 "claimed": true, 00:17:34.426 "claim_type": "exclusive_write", 00:17:34.426 "zoned": false, 00:17:34.426 "supported_io_types": { 00:17:34.426 "read": true, 00:17:34.426 "write": true, 00:17:34.426 "unmap": true, 00:17:34.426 "write_zeroes": true, 00:17:34.426 "flush": true, 00:17:34.426 "reset": true, 00:17:34.426 "compare": false, 00:17:34.426 "compare_and_write": false, 00:17:34.426 "abort": true, 00:17:34.426 "nvme_admin": false, 00:17:34.426 "nvme_io": false 00:17:34.426 }, 00:17:34.426 "memory_domains": [ 00:17:34.426 { 00:17:34.426 "dma_device_id": "system", 00:17:34.426 "dma_device_type": 1 00:17:34.426 }, 00:17:34.426 { 00:17:34.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:34.426 "dma_device_type": 2 00:17:34.426 
} 00:17:34.426 ], 00:17:34.426 "driver_specific": {} 00:17:34.426 }' 00:17:34.426 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:34.426 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:34.426 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:34.426 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:34.426 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:34.684 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:34.684 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:34.684 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:34.684 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:34.684 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:34.684 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:34.684 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:34.684 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@339 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:34.942 [2024-05-15 04:18:22.825175] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:34.942 [2024-05-15 04:18:22.825205] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:34.942 [2024-05-15 04:18:22.825282] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:34.942 [2024-05-15 04:18:22.825521] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:34.942 [2024-05-15 04:18:22.825536] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f1f4f0 name Existed_Raid, state offline 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@342 -- # killprocess 3896729 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3896729 ']' 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # kill -0 3896729 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # uname 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3896729 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3896729' 00:17:34.942 killing process with pid 3896729 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@965 -- # kill 3896729 00:17:34.942 [2024-05-15 04:18:22.866983] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:17:34.942 04:18:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@970 -- # wait 3896729 00:17:34.942 [2024-05-15 04:18:22.914916] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:35.200 04:18:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@344 -- # return 0 00:17:35.200 00:17:35.200 real 0m32.067s 00:17:35.200 user 0m59.892s 00:17:35.200 sys 0m4.326s 00:17:35.200 04:18:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:17:35.200 04:18:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:35.200 ************************************ 00:17:35.200 END TEST raid_state_function_test_sb 00:17:35.200 ************************************ 00:17:35.200 04:18:23 bdev_raid -- bdev/bdev_raid.sh@805 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:17:35.200 04:18:23 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:17:35.200 04:18:23 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:17:35.200 04:18:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:35.458 ************************************ 00:17:35.458 START TEST raid_superblock_test 00:17:35.458 ************************************ 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 4 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=4 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # raid_pid=3901856 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # waitforlisten 3901856 /var/tmp/spdk-raid.sock 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@827 -- # '[' -z 3901856 ']' 00:17:35.458 
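Before the traced setup below, it helps to see what raid_superblock_test reduces to: start a bare bdev_svc app on its own RPC socket, create four malloc bdevs, wrap each in a passthru bdev (pt1..pt4) with a fixed UUID, and assemble them into a raid1 volume with an on-disk superblock (-s). A condensed sketch of that sequence, using the same names and socket as this run and assuming it is launched from the spdk checkout with bdev_svc built; the real script additionally waits for the RPC socket (waitforlisten) before issuing any RPCs.

sock=/var/tmp/spdk-raid.sock
rpc="scripts/rpc.py -s $sock"

# Start the minimal bdev application with raid debug logging, as the test does.
test/app/bdev_svc/bdev_svc -r "$sock" -L bdev_raid &

# Four 32 MB malloc bdevs with 512-byte blocks, each exposed through a passthru
# bdev so the raid module sees a stable name and UUID per member.
for i in 1 2 3 4; do
    $rpc bdev_malloc_create 32 512 -b malloc$i
    $rpc bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
done

# Assemble the passthru bdevs into a raid1 bdev that writes a superblock (-s).
$rpc bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'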
04:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:35.458 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:17:35.458 04:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.458 [2024-05-15 04:18:23.296595] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:17:35.458 [2024-05-15 04:18:23.296661] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3901856 ] 00:17:35.458 [2024-05-15 04:18:23.371488] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:35.716 [2024-05-15 04:18:23.480550] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:17:35.716 [2024-05-15 04:18:23.545445] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:35.716 [2024-05-15 04:18:23.545484] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # return 0 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:35.716 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:35.973 malloc1 00:17:35.973 04:18:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:36.231 [2024-05-15 04:18:24.081297] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:36.231 [2024-05-15 04:18:24.081355] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:36.231 [2024-05-15 04:18:24.081386] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213ec20 00:17:36.231 [2024-05-15 04:18:24.081400] 
vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:36.231 [2024-05-15 04:18:24.083030] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:36.231 [2024-05-15 04:18:24.083055] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:36.231 pt1 00:17:36.231 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:17:36.231 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:17:36.231 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:17:36.231 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:17:36.231 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:36.231 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:36.231 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:17:36.231 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:36.231 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:36.489 malloc2 00:17:36.489 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:36.746 [2024-05-15 04:18:24.578478] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:36.746 [2024-05-15 04:18:24.578536] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:36.746 [2024-05-15 04:18:24.578561] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2136c00 00:17:36.746 [2024-05-15 04:18:24.578573] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:36.746 [2024-05-15 04:18:24.580352] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:36.746 [2024-05-15 04:18:24.580376] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:36.746 pt2 00:17:36.746 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:17:36.746 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:17:36.746 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc3 00:17:36.746 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt3 00:17:36.746 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:36.746 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:36.746 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:17:36.746 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:36.746 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 
00:17:37.004 malloc3 00:17:37.004 04:18:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:37.262 [2024-05-15 04:18:25.063230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:37.262 [2024-05-15 04:18:25.063297] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:37.262 [2024-05-15 04:18:25.063324] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22e79c0 00:17:37.262 [2024-05-15 04:18:25.063337] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:37.262 [2024-05-15 04:18:25.064749] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:37.262 [2024-05-15 04:18:25.064772] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:37.262 pt3 00:17:37.262 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:17:37.262 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:17:37.262 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc4 00:17:37.262 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt4 00:17:37.262 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:17:37.262 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:37.262 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:17:37.262 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:37.262 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:17:37.520 malloc4 00:17:37.520 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:37.778 [2024-05-15 04:18:25.586206] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:37.778 [2024-05-15 04:18:25.586261] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:37.778 [2024-05-15 04:18:25.586287] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213a8e0 00:17:37.778 [2024-05-15 04:18:25.586304] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:37.778 [2024-05-15 04:18:25.587835] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:37.778 [2024-05-15 04:18:25.587864] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:37.778 pt4 00:17:37.778 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:17:37.778 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:17:37.778 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n 
raid_bdev1 -s 00:17:38.036 [2024-05-15 04:18:25.875019] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:38.036 [2024-05-15 04:18:25.876520] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:38.036 [2024-05-15 04:18:25.876585] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:38.036 [2024-05-15 04:18:25.876656] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:38.036 [2024-05-15 04:18:25.876902] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x2139cc0 00:17:38.036 [2024-05-15 04:18:25.876920] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:38.036 [2024-05-15 04:18:25.877166] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2137190 00:17:38.036 [2024-05-15 04:18:25.877362] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2139cc0 00:17:38.036 [2024-05-15 04:18:25.877378] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2139cc0 00:17:38.036 [2024-05-15 04:18:25.877529] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.036 04:18:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:38.293 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:38.293 "name": "raid_bdev1", 00:17:38.293 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:38.293 "strip_size_kb": 0, 00:17:38.293 "state": "online", 00:17:38.293 "raid_level": "raid1", 00:17:38.293 "superblock": true, 00:17:38.293 "num_base_bdevs": 4, 00:17:38.293 "num_base_bdevs_discovered": 4, 00:17:38.293 "num_base_bdevs_operational": 4, 00:17:38.293 "base_bdevs_list": [ 00:17:38.293 { 00:17:38.293 "name": "pt1", 00:17:38.293 "uuid": "a7588dc9-7918-5088-aca8-2dcccd6b69e5", 00:17:38.293 "is_configured": true, 00:17:38.293 "data_offset": 2048, 00:17:38.293 "data_size": 63488 00:17:38.293 }, 00:17:38.293 { 00:17:38.293 "name": "pt2", 00:17:38.293 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:38.293 "is_configured": true, 00:17:38.293 "data_offset": 
2048, 00:17:38.294 "data_size": 63488 00:17:38.294 }, 00:17:38.294 { 00:17:38.294 "name": "pt3", 00:17:38.294 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:38.294 "is_configured": true, 00:17:38.294 "data_offset": 2048, 00:17:38.294 "data_size": 63488 00:17:38.294 }, 00:17:38.294 { 00:17:38.294 "name": "pt4", 00:17:38.294 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:38.294 "is_configured": true, 00:17:38.294 "data_offset": 2048, 00:17:38.294 "data_size": 63488 00:17:38.294 } 00:17:38.294 ] 00:17:38.294 }' 00:17:38.294 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:38.294 04:18:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.924 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:17:38.924 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:17:38.924 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:17:38.924 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:38.924 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:38.924 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:17:38.924 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:38.924 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:38.924 [2024-05-15 04:18:26.906048] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:39.182 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:39.182 "name": "raid_bdev1", 00:17:39.182 "aliases": [ 00:17:39.182 "4726f6f2-39c4-4c84-8d93-c699caa50090" 00:17:39.182 ], 00:17:39.182 "product_name": "Raid Volume", 00:17:39.182 "block_size": 512, 00:17:39.182 "num_blocks": 63488, 00:17:39.182 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:39.182 "assigned_rate_limits": { 00:17:39.182 "rw_ios_per_sec": 0, 00:17:39.182 "rw_mbytes_per_sec": 0, 00:17:39.182 "r_mbytes_per_sec": 0, 00:17:39.182 "w_mbytes_per_sec": 0 00:17:39.182 }, 00:17:39.182 "claimed": false, 00:17:39.182 "zoned": false, 00:17:39.182 "supported_io_types": { 00:17:39.182 "read": true, 00:17:39.182 "write": true, 00:17:39.182 "unmap": false, 00:17:39.182 "write_zeroes": true, 00:17:39.182 "flush": false, 00:17:39.182 "reset": true, 00:17:39.182 "compare": false, 00:17:39.182 "compare_and_write": false, 00:17:39.182 "abort": false, 00:17:39.182 "nvme_admin": false, 00:17:39.182 "nvme_io": false 00:17:39.182 }, 00:17:39.182 "memory_domains": [ 00:17:39.182 { 00:17:39.182 "dma_device_id": "system", 00:17:39.182 "dma_device_type": 1 00:17:39.182 }, 00:17:39.182 { 00:17:39.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.182 "dma_device_type": 2 00:17:39.182 }, 00:17:39.182 { 00:17:39.182 "dma_device_id": "system", 00:17:39.182 "dma_device_type": 1 00:17:39.182 }, 00:17:39.182 { 00:17:39.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.182 "dma_device_type": 2 00:17:39.182 }, 00:17:39.182 { 00:17:39.182 "dma_device_id": "system", 00:17:39.182 "dma_device_type": 1 00:17:39.182 }, 00:17:39.182 { 00:17:39.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.182 "dma_device_type": 2 00:17:39.182 }, 00:17:39.182 
{ 00:17:39.182 "dma_device_id": "system", 00:17:39.182 "dma_device_type": 1 00:17:39.182 }, 00:17:39.182 { 00:17:39.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.182 "dma_device_type": 2 00:17:39.182 } 00:17:39.182 ], 00:17:39.182 "driver_specific": { 00:17:39.182 "raid": { 00:17:39.182 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:39.182 "strip_size_kb": 0, 00:17:39.182 "state": "online", 00:17:39.182 "raid_level": "raid1", 00:17:39.182 "superblock": true, 00:17:39.182 "num_base_bdevs": 4, 00:17:39.182 "num_base_bdevs_discovered": 4, 00:17:39.182 "num_base_bdevs_operational": 4, 00:17:39.182 "base_bdevs_list": [ 00:17:39.182 { 00:17:39.182 "name": "pt1", 00:17:39.182 "uuid": "a7588dc9-7918-5088-aca8-2dcccd6b69e5", 00:17:39.182 "is_configured": true, 00:17:39.182 "data_offset": 2048, 00:17:39.182 "data_size": 63488 00:17:39.182 }, 00:17:39.182 { 00:17:39.182 "name": "pt2", 00:17:39.182 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:39.182 "is_configured": true, 00:17:39.182 "data_offset": 2048, 00:17:39.182 "data_size": 63488 00:17:39.182 }, 00:17:39.182 { 00:17:39.182 "name": "pt3", 00:17:39.182 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:39.182 "is_configured": true, 00:17:39.182 "data_offset": 2048, 00:17:39.182 "data_size": 63488 00:17:39.182 }, 00:17:39.182 { 00:17:39.182 "name": "pt4", 00:17:39.182 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:39.182 "is_configured": true, 00:17:39.182 "data_offset": 2048, 00:17:39.182 "data_size": 63488 00:17:39.182 } 00:17:39.182 ] 00:17:39.182 } 00:17:39.182 } 00:17:39.182 }' 00:17:39.182 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:39.182 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:17:39.182 pt2 00:17:39.182 pt3 00:17:39.182 pt4' 00:17:39.182 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:39.182 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:39.182 04:18:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:39.440 "name": "pt1", 00:17:39.440 "aliases": [ 00:17:39.440 "a7588dc9-7918-5088-aca8-2dcccd6b69e5" 00:17:39.440 ], 00:17:39.440 "product_name": "passthru", 00:17:39.440 "block_size": 512, 00:17:39.440 "num_blocks": 65536, 00:17:39.440 "uuid": "a7588dc9-7918-5088-aca8-2dcccd6b69e5", 00:17:39.440 "assigned_rate_limits": { 00:17:39.440 "rw_ios_per_sec": 0, 00:17:39.440 "rw_mbytes_per_sec": 0, 00:17:39.440 "r_mbytes_per_sec": 0, 00:17:39.440 "w_mbytes_per_sec": 0 00:17:39.440 }, 00:17:39.440 "claimed": true, 00:17:39.440 "claim_type": "exclusive_write", 00:17:39.440 "zoned": false, 00:17:39.440 "supported_io_types": { 00:17:39.440 "read": true, 00:17:39.440 "write": true, 00:17:39.440 "unmap": true, 00:17:39.440 "write_zeroes": true, 00:17:39.440 "flush": true, 00:17:39.440 "reset": true, 00:17:39.440 "compare": false, 00:17:39.440 "compare_and_write": false, 00:17:39.440 "abort": true, 00:17:39.440 "nvme_admin": false, 00:17:39.440 "nvme_io": false 00:17:39.440 }, 00:17:39.440 "memory_domains": [ 00:17:39.440 { 00:17:39.440 "dma_device_id": "system", 00:17:39.440 "dma_device_type": 1 00:17:39.440 }, 00:17:39.440 { 
00:17:39.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.440 "dma_device_type": 2 00:17:39.440 } 00:17:39.440 ], 00:17:39.440 "driver_specific": { 00:17:39.440 "passthru": { 00:17:39.440 "name": "pt1", 00:17:39.440 "base_bdev_name": "malloc1" 00:17:39.440 } 00:17:39.440 } 00:17:39.440 }' 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:39.440 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:39.698 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:39.698 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:39.698 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:39.698 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:39.698 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:39.956 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:39.956 "name": "pt2", 00:17:39.956 "aliases": [ 00:17:39.956 "b0897720-3e79-5eae-aa46-62f78ed92740" 00:17:39.956 ], 00:17:39.956 "product_name": "passthru", 00:17:39.956 "block_size": 512, 00:17:39.956 "num_blocks": 65536, 00:17:39.956 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:39.956 "assigned_rate_limits": { 00:17:39.956 "rw_ios_per_sec": 0, 00:17:39.956 "rw_mbytes_per_sec": 0, 00:17:39.956 "r_mbytes_per_sec": 0, 00:17:39.956 "w_mbytes_per_sec": 0 00:17:39.956 }, 00:17:39.956 "claimed": true, 00:17:39.956 "claim_type": "exclusive_write", 00:17:39.956 "zoned": false, 00:17:39.956 "supported_io_types": { 00:17:39.956 "read": true, 00:17:39.956 "write": true, 00:17:39.956 "unmap": true, 00:17:39.956 "write_zeroes": true, 00:17:39.956 "flush": true, 00:17:39.956 "reset": true, 00:17:39.956 "compare": false, 00:17:39.956 "compare_and_write": false, 00:17:39.956 "abort": true, 00:17:39.956 "nvme_admin": false, 00:17:39.956 "nvme_io": false 00:17:39.956 }, 00:17:39.956 "memory_domains": [ 00:17:39.956 { 00:17:39.956 "dma_device_id": "system", 00:17:39.956 "dma_device_type": 1 00:17:39.956 }, 00:17:39.956 { 00:17:39.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.956 "dma_device_type": 2 00:17:39.956 } 00:17:39.956 ], 00:17:39.956 "driver_specific": { 00:17:39.956 "passthru": { 00:17:39.956 "name": "pt2", 00:17:39.956 "base_bdev_name": "malloc2" 00:17:39.956 } 00:17:39.956 } 00:17:39.956 }' 00:17:39.956 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:39.956 04:18:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:39.956 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:39.956 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:39.956 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:39.956 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:39.956 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:39.956 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:40.213 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:40.213 04:18:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:40.213 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:40.214 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:40.214 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:40.214 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:40.214 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:40.471 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:40.471 "name": "pt3", 00:17:40.471 "aliases": [ 00:17:40.471 "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0" 00:17:40.471 ], 00:17:40.471 "product_name": "passthru", 00:17:40.471 "block_size": 512, 00:17:40.471 "num_blocks": 65536, 00:17:40.471 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:40.471 "assigned_rate_limits": { 00:17:40.471 "rw_ios_per_sec": 0, 00:17:40.471 "rw_mbytes_per_sec": 0, 00:17:40.471 "r_mbytes_per_sec": 0, 00:17:40.471 "w_mbytes_per_sec": 0 00:17:40.471 }, 00:17:40.471 "claimed": true, 00:17:40.471 "claim_type": "exclusive_write", 00:17:40.471 "zoned": false, 00:17:40.471 "supported_io_types": { 00:17:40.471 "read": true, 00:17:40.471 "write": true, 00:17:40.471 "unmap": true, 00:17:40.471 "write_zeroes": true, 00:17:40.471 "flush": true, 00:17:40.471 "reset": true, 00:17:40.471 "compare": false, 00:17:40.471 "compare_and_write": false, 00:17:40.471 "abort": true, 00:17:40.471 "nvme_admin": false, 00:17:40.471 "nvme_io": false 00:17:40.471 }, 00:17:40.471 "memory_domains": [ 00:17:40.471 { 00:17:40.471 "dma_device_id": "system", 00:17:40.471 "dma_device_type": 1 00:17:40.471 }, 00:17:40.471 { 00:17:40.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.471 "dma_device_type": 2 00:17:40.471 } 00:17:40.471 ], 00:17:40.471 "driver_specific": { 00:17:40.471 "passthru": { 00:17:40.471 "name": "pt3", 00:17:40.471 "base_bdev_name": "malloc3" 00:17:40.471 } 00:17:40.471 } 00:17:40.471 }' 00:17:40.471 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:40.471 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:40.471 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:40.471 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:40.471 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:40.471 04:18:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:40.471 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:40.729 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:40.729 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:40.729 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:40.729 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:40.729 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:40.729 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:40.729 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:40.729 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:40.987 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:40.987 "name": "pt4", 00:17:40.987 "aliases": [ 00:17:40.987 "5a079308-2db7-5bea-9c7d-4fde0b5606e5" 00:17:40.987 ], 00:17:40.987 "product_name": "passthru", 00:17:40.987 "block_size": 512, 00:17:40.987 "num_blocks": 65536, 00:17:40.987 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:40.987 "assigned_rate_limits": { 00:17:40.987 "rw_ios_per_sec": 0, 00:17:40.987 "rw_mbytes_per_sec": 0, 00:17:40.987 "r_mbytes_per_sec": 0, 00:17:40.987 "w_mbytes_per_sec": 0 00:17:40.987 }, 00:17:40.987 "claimed": true, 00:17:40.987 "claim_type": "exclusive_write", 00:17:40.987 "zoned": false, 00:17:40.987 "supported_io_types": { 00:17:40.987 "read": true, 00:17:40.987 "write": true, 00:17:40.987 "unmap": true, 00:17:40.987 "write_zeroes": true, 00:17:40.987 "flush": true, 00:17:40.987 "reset": true, 00:17:40.987 "compare": false, 00:17:40.987 "compare_and_write": false, 00:17:40.987 "abort": true, 00:17:40.987 "nvme_admin": false, 00:17:40.987 "nvme_io": false 00:17:40.987 }, 00:17:40.987 "memory_domains": [ 00:17:40.987 { 00:17:40.987 "dma_device_id": "system", 00:17:40.987 "dma_device_type": 1 00:17:40.987 }, 00:17:40.987 { 00:17:40.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.987 "dma_device_type": 2 00:17:40.987 } 00:17:40.987 ], 00:17:40.987 "driver_specific": { 00:17:40.987 "passthru": { 00:17:40.987 "name": "pt4", 00:17:40.987 "base_bdev_name": "malloc4" 00:17:40.987 } 00:17:40.987 } 00:17:40.987 }' 00:17:40.987 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:40.987 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:40.987 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:40.987 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:40.987 04:18:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:41.245 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:41.245 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:41.245 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:41.245 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:41.245 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq 
.dif_type 00:17:41.245 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:41.245 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:41.245 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:41.245 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:17:41.503 [2024-05-15 04:18:29.384657] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:41.503 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=4726f6f2-39c4-4c84-8d93-c699caa50090 00:17:41.503 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # '[' -z 4726f6f2-39c4-4c84-8d93-c699caa50090 ']' 00:17:41.503 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:41.761 [2024-05-15 04:18:29.665124] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:41.761 [2024-05-15 04:18:29.665152] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:41.761 [2024-05-15 04:18:29.665231] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:41.761 [2024-05-15 04:18:29.665325] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:41.761 [2024-05-15 04:18:29.665339] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2139cc0 name raid_bdev1, state offline 00:17:41.761 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.761 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:17:42.018 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:17:42.018 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:17:42.018 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:17:42.018 04:18:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:42.276 04:18:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:17:42.276 04:18:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:42.534 04:18:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:17:42.534 04:18:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:42.792 04:18:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:17:42.792 04:18:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:43.050 04:18:30 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:43.050 04:18:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:43.308 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:17:43.566 [2024-05-15 04:18:31.429749] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:43.566 [2024-05-15 04:18:31.431039] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:43.566 [2024-05-15 04:18:31.431083] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:43.566 [2024-05-15 04:18:31.431145] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:17:43.566 [2024-05-15 04:18:31.431217] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:43.566 [2024-05-15 04:18:31.431274] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:43.566 [2024-05-15 04:18:31.431300] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:43.566 [2024-05-15 04:18:31.431324] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock 
of a different raid bdev found on bdev malloc4 00:17:43.566 [2024-05-15 04:18:31.431343] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:43.566 [2024-05-15 04:18:31.431354] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x213e2d0 name raid_bdev1, state configuring 00:17:43.566 request: 00:17:43.566 { 00:17:43.566 "name": "raid_bdev1", 00:17:43.566 "raid_level": "raid1", 00:17:43.566 "base_bdevs": [ 00:17:43.566 "malloc1", 00:17:43.566 "malloc2", 00:17:43.566 "malloc3", 00:17:43.566 "malloc4" 00:17:43.566 ], 00:17:43.566 "superblock": false, 00:17:43.566 "method": "bdev_raid_create", 00:17:43.566 "req_id": 1 00:17:43.566 } 00:17:43.566 Got JSON-RPC error response 00:17:43.566 response: 00:17:43.566 { 00:17:43.566 "code": -17, 00:17:43.566 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:43.566 } 00:17:43.566 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:43.566 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:43.566 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:43.566 04:18:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:43.566 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.566 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:17:43.824 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:17:43.824 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:17:43.824 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:44.082 [2024-05-15 04:18:31.910978] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:44.082 [2024-05-15 04:18:31.911026] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:44.082 [2024-05-15 04:18:31.911051] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213ca80 00:17:44.082 [2024-05-15 04:18:31.911065] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:44.082 [2024-05-15 04:18:31.912505] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:44.082 [2024-05-15 04:18:31.912528] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:44.082 [2024-05-15 04:18:31.912605] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:44.082 [2024-05-15 04:18:31.912639] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:44.082 pt1 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.082 04:18:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:44.340 04:18:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:44.340 "name": "raid_bdev1", 00:17:44.340 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:44.340 "strip_size_kb": 0, 00:17:44.340 "state": "configuring", 00:17:44.340 "raid_level": "raid1", 00:17:44.340 "superblock": true, 00:17:44.340 "num_base_bdevs": 4, 00:17:44.340 "num_base_bdevs_discovered": 1, 00:17:44.340 "num_base_bdevs_operational": 4, 00:17:44.340 "base_bdevs_list": [ 00:17:44.340 { 00:17:44.340 "name": "pt1", 00:17:44.340 "uuid": "a7588dc9-7918-5088-aca8-2dcccd6b69e5", 00:17:44.340 "is_configured": true, 00:17:44.340 "data_offset": 2048, 00:17:44.340 "data_size": 63488 00:17:44.340 }, 00:17:44.340 { 00:17:44.340 "name": null, 00:17:44.340 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:44.340 "is_configured": false, 00:17:44.340 "data_offset": 2048, 00:17:44.340 "data_size": 63488 00:17:44.340 }, 00:17:44.340 { 00:17:44.340 "name": null, 00:17:44.340 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:44.340 "is_configured": false, 00:17:44.340 "data_offset": 2048, 00:17:44.340 "data_size": 63488 00:17:44.340 }, 00:17:44.340 { 00:17:44.340 "name": null, 00:17:44.340 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:44.340 "is_configured": false, 00:17:44.340 "data_offset": 2048, 00:17:44.340 "data_size": 63488 00:17:44.340 } 00:17:44.340 ] 00:17:44.340 }' 00:17:44.340 04:18:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:44.340 04:18:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.907 04:18:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@470 -- # '[' 4 -gt 2 ']' 00:17:44.907 04:18:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:45.166 [2024-05-15 04:18:32.953816] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:45.166 [2024-05-15 04:18:32.953889] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:45.166 [2024-05-15 04:18:32.953914] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2138590 00:17:45.166 [2024-05-15 04:18:32.953927] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:45.166 [2024-05-15 04:18:32.954297] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:45.166 [2024-05-15 04:18:32.954318] vbdev_passthru.c: 705:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:17:45.166 [2024-05-15 04:18:32.954392] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:45.166 [2024-05-15 04:18:32.954417] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:45.166 pt2 00:17:45.166 04:18:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:45.424 [2024-05-15 04:18:33.198491] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.424 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:45.682 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:45.682 "name": "raid_bdev1", 00:17:45.682 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:45.682 "strip_size_kb": 0, 00:17:45.682 "state": "configuring", 00:17:45.682 "raid_level": "raid1", 00:17:45.682 "superblock": true, 00:17:45.682 "num_base_bdevs": 4, 00:17:45.682 "num_base_bdevs_discovered": 1, 00:17:45.682 "num_base_bdevs_operational": 4, 00:17:45.682 "base_bdevs_list": [ 00:17:45.682 { 00:17:45.682 "name": "pt1", 00:17:45.682 "uuid": "a7588dc9-7918-5088-aca8-2dcccd6b69e5", 00:17:45.682 "is_configured": true, 00:17:45.682 "data_offset": 2048, 00:17:45.682 "data_size": 63488 00:17:45.682 }, 00:17:45.682 { 00:17:45.682 "name": null, 00:17:45.682 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:45.682 "is_configured": false, 00:17:45.682 "data_offset": 2048, 00:17:45.682 "data_size": 63488 00:17:45.682 }, 00:17:45.682 { 00:17:45.682 "name": null, 00:17:45.682 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:45.682 "is_configured": false, 00:17:45.682 "data_offset": 2048, 00:17:45.682 "data_size": 63488 00:17:45.682 }, 00:17:45.682 { 00:17:45.682 "name": null, 00:17:45.682 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:45.682 "is_configured": false, 00:17:45.682 "data_offset": 2048, 00:17:45.682 "data_size": 63488 00:17:45.682 } 00:17:45.682 ] 00:17:45.682 }' 00:17:45.682 04:18:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
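In this part of the test the passthru bdevs are recreated (and one removed again) on top of malloc bdevs that still carry the raid superblock, so raid_bdev1 is re-assembled and its state moves from "configuring" to "online" once all four members are claimed. A small sketch of the state readback that verify_raid_bdev_state wraps (the per-field jq extraction here is a simplified stand-in for the helper's full comparison; the rpc.py path, socket and jq select filter are from this log):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # Pull the raid_bdev1 entry out of the full raid bdev listing
    raid_bdev_info=$($rpc -s $sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1")')
    # "configuring" while base bdevs are still being claimed, "online" once all
    # four passthru bdevs are back; the discovered count grows from 1 to 4
    jq -r .state <<< "$raid_bdev_info"
    jq -r .num_base_bdevs_discovered <<< "$raid_bdev_info"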
00:17:45.682 04:18:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:46.251 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:17:46.251 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:17:46.251 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:46.251 [2024-05-15 04:18:34.225223] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:46.251 [2024-05-15 04:18:34.225287] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:46.251 [2024-05-15 04:18:34.225315] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213c7e0 00:17:46.251 [2024-05-15 04:18:34.225331] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:46.251 [2024-05-15 04:18:34.225729] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:46.251 [2024-05-15 04:18:34.225757] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:46.251 [2024-05-15 04:18:34.225847] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:46.251 [2024-05-15 04:18:34.225877] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:46.251 pt2 00:17:46.251 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:17:46.251 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:17:46.251 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:46.510 [2024-05-15 04:18:34.497954] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:46.510 [2024-05-15 04:18:34.498008] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:46.510 [2024-05-15 04:18:34.498034] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213cd80 00:17:46.510 [2024-05-15 04:18:34.498050] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:46.510 [2024-05-15 04:18:34.498458] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:46.510 [2024-05-15 04:18:34.498484] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:46.510 [2024-05-15 04:18:34.498565] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:46.510 [2024-05-15 04:18:34.498596] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:46.510 pt3 00:17:46.510 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:17:46.510 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:17:46.510 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:46.769 [2024-05-15 04:18:34.742584] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:46.769 
[2024-05-15 04:18:34.742631] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:46.769 [2024-05-15 04:18:34.742652] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213f780 00:17:46.769 [2024-05-15 04:18:34.742667] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:46.769 [2024-05-15 04:18:34.742956] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:46.769 [2024-05-15 04:18:34.742983] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:46.769 [2024-05-15 04:18:34.743040] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:46.769 [2024-05-15 04:18:34.743066] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:46.769 [2024-05-15 04:18:34.743200] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x213af10 00:17:46.769 [2024-05-15 04:18:34.743216] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:46.769 [2024-05-15 04:18:34.743385] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2135f00 00:17:46.769 [2024-05-15 04:18:34.743550] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x213af10 00:17:46.769 [2024-05-15 04:18:34.743566] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x213af10 00:17:46.769 [2024-05-15 04:18:34.743677] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:46.769 pt4 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.769 04:18:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:47.027 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:47.027 "name": "raid_bdev1", 00:17:47.027 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:47.027 "strip_size_kb": 0, 00:17:47.027 "state": "online", 00:17:47.027 "raid_level": "raid1", 00:17:47.027 
"superblock": true, 00:17:47.027 "num_base_bdevs": 4, 00:17:47.027 "num_base_bdevs_discovered": 4, 00:17:47.027 "num_base_bdevs_operational": 4, 00:17:47.027 "base_bdevs_list": [ 00:17:47.027 { 00:17:47.027 "name": "pt1", 00:17:47.027 "uuid": "a7588dc9-7918-5088-aca8-2dcccd6b69e5", 00:17:47.027 "is_configured": true, 00:17:47.027 "data_offset": 2048, 00:17:47.027 "data_size": 63488 00:17:47.027 }, 00:17:47.027 { 00:17:47.027 "name": "pt2", 00:17:47.027 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:47.027 "is_configured": true, 00:17:47.027 "data_offset": 2048, 00:17:47.027 "data_size": 63488 00:17:47.027 }, 00:17:47.027 { 00:17:47.027 "name": "pt3", 00:17:47.027 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:47.027 "is_configured": true, 00:17:47.027 "data_offset": 2048, 00:17:47.027 "data_size": 63488 00:17:47.027 }, 00:17:47.027 { 00:17:47.027 "name": "pt4", 00:17:47.027 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:47.027 "is_configured": true, 00:17:47.027 "data_offset": 2048, 00:17:47.027 "data_size": 63488 00:17:47.027 } 00:17:47.027 ] 00:17:47.027 }' 00:17:47.027 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:47.027 04:18:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:47.960 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:17:47.960 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:17:47.960 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:17:47.960 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:17:47.960 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:17:47.960 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@199 -- # local name 00:17:47.960 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:47.960 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:17:47.960 [2024-05-15 04:18:35.829741] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:47.960 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:17:47.960 "name": "raid_bdev1", 00:17:47.960 "aliases": [ 00:17:47.960 "4726f6f2-39c4-4c84-8d93-c699caa50090" 00:17:47.960 ], 00:17:47.960 "product_name": "Raid Volume", 00:17:47.960 "block_size": 512, 00:17:47.960 "num_blocks": 63488, 00:17:47.960 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:47.960 "assigned_rate_limits": { 00:17:47.960 "rw_ios_per_sec": 0, 00:17:47.960 "rw_mbytes_per_sec": 0, 00:17:47.960 "r_mbytes_per_sec": 0, 00:17:47.960 "w_mbytes_per_sec": 0 00:17:47.960 }, 00:17:47.960 "claimed": false, 00:17:47.960 "zoned": false, 00:17:47.960 "supported_io_types": { 00:17:47.960 "read": true, 00:17:47.960 "write": true, 00:17:47.960 "unmap": false, 00:17:47.960 "write_zeroes": true, 00:17:47.960 "flush": false, 00:17:47.960 "reset": true, 00:17:47.960 "compare": false, 00:17:47.960 "compare_and_write": false, 00:17:47.960 "abort": false, 00:17:47.960 "nvme_admin": false, 00:17:47.960 "nvme_io": false 00:17:47.960 }, 00:17:47.960 "memory_domains": [ 00:17:47.960 { 00:17:47.960 "dma_device_id": "system", 00:17:47.960 "dma_device_type": 1 00:17:47.960 }, 00:17:47.960 
{ 00:17:47.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.960 "dma_device_type": 2 00:17:47.960 }, 00:17:47.960 { 00:17:47.960 "dma_device_id": "system", 00:17:47.960 "dma_device_type": 1 00:17:47.960 }, 00:17:47.960 { 00:17:47.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.960 "dma_device_type": 2 00:17:47.960 }, 00:17:47.960 { 00:17:47.960 "dma_device_id": "system", 00:17:47.960 "dma_device_type": 1 00:17:47.960 }, 00:17:47.960 { 00:17:47.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.960 "dma_device_type": 2 00:17:47.960 }, 00:17:47.960 { 00:17:47.960 "dma_device_id": "system", 00:17:47.960 "dma_device_type": 1 00:17:47.960 }, 00:17:47.960 { 00:17:47.960 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.960 "dma_device_type": 2 00:17:47.960 } 00:17:47.960 ], 00:17:47.960 "driver_specific": { 00:17:47.960 "raid": { 00:17:47.960 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:47.960 "strip_size_kb": 0, 00:17:47.960 "state": "online", 00:17:47.960 "raid_level": "raid1", 00:17:47.960 "superblock": true, 00:17:47.960 "num_base_bdevs": 4, 00:17:47.960 "num_base_bdevs_discovered": 4, 00:17:47.960 "num_base_bdevs_operational": 4, 00:17:47.960 "base_bdevs_list": [ 00:17:47.960 { 00:17:47.960 "name": "pt1", 00:17:47.960 "uuid": "a7588dc9-7918-5088-aca8-2dcccd6b69e5", 00:17:47.960 "is_configured": true, 00:17:47.960 "data_offset": 2048, 00:17:47.960 "data_size": 63488 00:17:47.960 }, 00:17:47.960 { 00:17:47.960 "name": "pt2", 00:17:47.960 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:47.960 "is_configured": true, 00:17:47.960 "data_offset": 2048, 00:17:47.960 "data_size": 63488 00:17:47.960 }, 00:17:47.960 { 00:17:47.960 "name": "pt3", 00:17:47.960 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:47.961 "is_configured": true, 00:17:47.961 "data_offset": 2048, 00:17:47.961 "data_size": 63488 00:17:47.961 }, 00:17:47.961 { 00:17:47.961 "name": "pt4", 00:17:47.961 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:47.961 "is_configured": true, 00:17:47.961 "data_offset": 2048, 00:17:47.961 "data_size": 63488 00:17:47.961 } 00:17:47.961 ] 00:17:47.961 } 00:17:47.961 } 00:17:47.961 }' 00:17:47.961 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:47.961 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:17:47.961 pt2 00:17:47.961 pt3 00:17:47.961 pt4' 00:17:47.961 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:47.961 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:47.961 04:18:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:48.219 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:48.219 "name": "pt1", 00:17:48.219 "aliases": [ 00:17:48.219 "a7588dc9-7918-5088-aca8-2dcccd6b69e5" 00:17:48.219 ], 00:17:48.219 "product_name": "passthru", 00:17:48.219 "block_size": 512, 00:17:48.219 "num_blocks": 65536, 00:17:48.219 "uuid": "a7588dc9-7918-5088-aca8-2dcccd6b69e5", 00:17:48.219 "assigned_rate_limits": { 00:17:48.219 "rw_ios_per_sec": 0, 00:17:48.219 "rw_mbytes_per_sec": 0, 00:17:48.219 "r_mbytes_per_sec": 0, 00:17:48.219 "w_mbytes_per_sec": 0 00:17:48.219 }, 00:17:48.219 "claimed": true, 00:17:48.219 "claim_type": "exclusive_write", 
00:17:48.219 "zoned": false, 00:17:48.219 "supported_io_types": { 00:17:48.219 "read": true, 00:17:48.219 "write": true, 00:17:48.219 "unmap": true, 00:17:48.219 "write_zeroes": true, 00:17:48.219 "flush": true, 00:17:48.219 "reset": true, 00:17:48.219 "compare": false, 00:17:48.219 "compare_and_write": false, 00:17:48.219 "abort": true, 00:17:48.219 "nvme_admin": false, 00:17:48.219 "nvme_io": false 00:17:48.219 }, 00:17:48.219 "memory_domains": [ 00:17:48.219 { 00:17:48.219 "dma_device_id": "system", 00:17:48.219 "dma_device_type": 1 00:17:48.219 }, 00:17:48.219 { 00:17:48.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.219 "dma_device_type": 2 00:17:48.219 } 00:17:48.219 ], 00:17:48.219 "driver_specific": { 00:17:48.219 "passthru": { 00:17:48.219 "name": "pt1", 00:17:48.219 "base_bdev_name": "malloc1" 00:17:48.219 } 00:17:48.219 } 00:17:48.219 }' 00:17:48.219 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:48.219 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:48.219 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:48.219 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:48.477 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:48.735 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:48.735 "name": "pt2", 00:17:48.735 "aliases": [ 00:17:48.735 "b0897720-3e79-5eae-aa46-62f78ed92740" 00:17:48.735 ], 00:17:48.735 "product_name": "passthru", 00:17:48.735 "block_size": 512, 00:17:48.735 "num_blocks": 65536, 00:17:48.735 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:48.735 "assigned_rate_limits": { 00:17:48.735 "rw_ios_per_sec": 0, 00:17:48.735 "rw_mbytes_per_sec": 0, 00:17:48.735 "r_mbytes_per_sec": 0, 00:17:48.735 "w_mbytes_per_sec": 0 00:17:48.735 }, 00:17:48.735 "claimed": true, 00:17:48.735 "claim_type": "exclusive_write", 00:17:48.735 "zoned": false, 00:17:48.735 "supported_io_types": { 00:17:48.735 "read": true, 00:17:48.735 "write": true, 00:17:48.735 "unmap": true, 00:17:48.735 "write_zeroes": true, 00:17:48.735 "flush": true, 00:17:48.735 "reset": true, 00:17:48.735 "compare": false, 00:17:48.735 "compare_and_write": false, 00:17:48.735 "abort": true, 00:17:48.735 "nvme_admin": false, 00:17:48.735 "nvme_io": false 00:17:48.735 }, 00:17:48.735 
"memory_domains": [ 00:17:48.735 { 00:17:48.735 "dma_device_id": "system", 00:17:48.735 "dma_device_type": 1 00:17:48.735 }, 00:17:48.735 { 00:17:48.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.735 "dma_device_type": 2 00:17:48.735 } 00:17:48.735 ], 00:17:48.735 "driver_specific": { 00:17:48.735 "passthru": { 00:17:48.735 "name": "pt2", 00:17:48.735 "base_bdev_name": "malloc2" 00:17:48.735 } 00:17:48.735 } 00:17:48.735 }' 00:17:48.735 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:48.735 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:48.735 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:48.735 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:48.993 04:18:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:49.252 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:49.252 "name": "pt3", 00:17:49.252 "aliases": [ 00:17:49.252 "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0" 00:17:49.252 ], 00:17:49.252 "product_name": "passthru", 00:17:49.252 "block_size": 512, 00:17:49.252 "num_blocks": 65536, 00:17:49.252 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:49.252 "assigned_rate_limits": { 00:17:49.252 "rw_ios_per_sec": 0, 00:17:49.252 "rw_mbytes_per_sec": 0, 00:17:49.252 "r_mbytes_per_sec": 0, 00:17:49.252 "w_mbytes_per_sec": 0 00:17:49.252 }, 00:17:49.252 "claimed": true, 00:17:49.252 "claim_type": "exclusive_write", 00:17:49.252 "zoned": false, 00:17:49.252 "supported_io_types": { 00:17:49.252 "read": true, 00:17:49.252 "write": true, 00:17:49.252 "unmap": true, 00:17:49.252 "write_zeroes": true, 00:17:49.252 "flush": true, 00:17:49.252 "reset": true, 00:17:49.252 "compare": false, 00:17:49.252 "compare_and_write": false, 00:17:49.252 "abort": true, 00:17:49.252 "nvme_admin": false, 00:17:49.252 "nvme_io": false 00:17:49.252 }, 00:17:49.252 "memory_domains": [ 00:17:49.252 { 00:17:49.252 "dma_device_id": "system", 00:17:49.252 "dma_device_type": 1 00:17:49.252 }, 00:17:49.252 { 00:17:49.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:49.252 "dma_device_type": 2 00:17:49.252 } 00:17:49.252 ], 00:17:49.252 "driver_specific": { 00:17:49.252 "passthru": { 00:17:49.252 "name": "pt3", 00:17:49.252 "base_bdev_name": "malloc3" 00:17:49.252 } 00:17:49.252 } 00:17:49.252 }' 
00:17:49.252 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:49.511 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:49.511 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:49.511 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:49.511 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:49.511 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:49.511 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:49.511 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:49.511 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:49.511 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:49.511 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:49.769 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:49.769 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:17:49.769 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:49.769 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:17:50.028 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:17:50.028 "name": "pt4", 00:17:50.028 "aliases": [ 00:17:50.028 "5a079308-2db7-5bea-9c7d-4fde0b5606e5" 00:17:50.028 ], 00:17:50.028 "product_name": "passthru", 00:17:50.028 "block_size": 512, 00:17:50.028 "num_blocks": 65536, 00:17:50.028 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:50.028 "assigned_rate_limits": { 00:17:50.028 "rw_ios_per_sec": 0, 00:17:50.028 "rw_mbytes_per_sec": 0, 00:17:50.028 "r_mbytes_per_sec": 0, 00:17:50.028 "w_mbytes_per_sec": 0 00:17:50.028 }, 00:17:50.028 "claimed": true, 00:17:50.028 "claim_type": "exclusive_write", 00:17:50.028 "zoned": false, 00:17:50.028 "supported_io_types": { 00:17:50.028 "read": true, 00:17:50.028 "write": true, 00:17:50.028 "unmap": true, 00:17:50.028 "write_zeroes": true, 00:17:50.028 "flush": true, 00:17:50.028 "reset": true, 00:17:50.028 "compare": false, 00:17:50.028 "compare_and_write": false, 00:17:50.028 "abort": true, 00:17:50.028 "nvme_admin": false, 00:17:50.028 "nvme_io": false 00:17:50.028 }, 00:17:50.028 "memory_domains": [ 00:17:50.028 { 00:17:50.028 "dma_device_id": "system", 00:17:50.028 "dma_device_type": 1 00:17:50.028 }, 00:17:50.028 { 00:17:50.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.028 "dma_device_type": 2 00:17:50.028 } 00:17:50.028 ], 00:17:50.028 "driver_specific": { 00:17:50.028 "passthru": { 00:17:50.028 "name": "pt4", 00:17:50.028 "base_bdev_name": "malloc4" 00:17:50.028 } 00:17:50.028 } 00:17:50.028 }' 00:17:50.028 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:50.028 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:17:50.028 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ 512 == 512 ]] 00:17:50.028 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:50.028 04:18:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:17:50.028 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:50.028 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:50.028 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:17:50.028 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:50.028 04:18:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:50.028 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:17:50.286 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:17:50.286 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:50.286 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:17:50.286 [2024-05-15 04:18:38.268234] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:50.286 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # '[' 4726f6f2-39c4-4c84-8d93-c699caa50090 '!=' 4726f6f2-39c4-4c84-8d93-c699caa50090 ']' 00:17:50.286 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:17:50.286 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # case $1 in 00:17:50.286 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 0 00:17:50.286 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:50.544 [2024-05-15 04:18:38.556830] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.802 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:51.060 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:51.060 "name": "raid_bdev1", 00:17:51.060 "uuid": 
"4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:51.060 "strip_size_kb": 0, 00:17:51.060 "state": "online", 00:17:51.060 "raid_level": "raid1", 00:17:51.060 "superblock": true, 00:17:51.060 "num_base_bdevs": 4, 00:17:51.060 "num_base_bdevs_discovered": 3, 00:17:51.060 "num_base_bdevs_operational": 3, 00:17:51.060 "base_bdevs_list": [ 00:17:51.060 { 00:17:51.060 "name": null, 00:17:51.060 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.060 "is_configured": false, 00:17:51.060 "data_offset": 2048, 00:17:51.060 "data_size": 63488 00:17:51.060 }, 00:17:51.060 { 00:17:51.060 "name": "pt2", 00:17:51.060 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:51.060 "is_configured": true, 00:17:51.060 "data_offset": 2048, 00:17:51.060 "data_size": 63488 00:17:51.061 }, 00:17:51.061 { 00:17:51.061 "name": "pt3", 00:17:51.061 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:51.061 "is_configured": true, 00:17:51.061 "data_offset": 2048, 00:17:51.061 "data_size": 63488 00:17:51.061 }, 00:17:51.061 { 00:17:51.061 "name": "pt4", 00:17:51.061 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:51.061 "is_configured": true, 00:17:51.061 "data_offset": 2048, 00:17:51.061 "data_size": 63488 00:17:51.061 } 00:17:51.061 ] 00:17:51.061 }' 00:17:51.061 04:18:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:51.061 04:18:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.627 04:18:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:51.885 [2024-05-15 04:18:39.679748] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:51.885 [2024-05-15 04:18:39.679783] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:51.885 [2024-05-15 04:18:39.679871] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:51.885 [2024-05-15 04:18:39.679966] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:51.885 [2024-05-15 04:18:39.679984] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x213af10 name raid_bdev1, state offline 00:17:51.885 04:18:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.885 04:18:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:17:52.142 04:18:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:17:52.142 04:18:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:17:52.142 04:18:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:17:52.142 04:18:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:17:52.142 04:18:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:52.400 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:17:52.400 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:17:52.400 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:52.656 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:17:52.656 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:17:52.656 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:52.912 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:17:52.912 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:17:52.912 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:17:52.912 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:17:52.912 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:53.169 [2024-05-15 04:18:40.951045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:53.169 [2024-05-15 04:18:40.951113] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:53.169 [2024-05-15 04:18:40.951139] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213fed0 00:17:53.169 [2024-05-15 04:18:40.951155] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:53.169 [2024-05-15 04:18:40.952880] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:53.169 [2024-05-15 04:18:40.952909] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:53.169 [2024-05-15 04:18:40.952986] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:53.170 [2024-05-15 04:18:40.953028] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:53.170 pt2 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.170 04:18:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:53.428 04:18:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:53.428 "name": "raid_bdev1", 00:17:53.428 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:53.428 "strip_size_kb": 0, 00:17:53.428 "state": "configuring", 00:17:53.428 "raid_level": "raid1", 00:17:53.428 "superblock": true, 00:17:53.428 "num_base_bdevs": 4, 00:17:53.428 "num_base_bdevs_discovered": 1, 00:17:53.428 "num_base_bdevs_operational": 3, 00:17:53.428 "base_bdevs_list": [ 00:17:53.428 { 00:17:53.428 "name": null, 00:17:53.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.428 "is_configured": false, 00:17:53.428 "data_offset": 2048, 00:17:53.428 "data_size": 63488 00:17:53.428 }, 00:17:53.428 { 00:17:53.428 "name": "pt2", 00:17:53.428 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:53.428 "is_configured": true, 00:17:53.428 "data_offset": 2048, 00:17:53.428 "data_size": 63488 00:17:53.428 }, 00:17:53.428 { 00:17:53.428 "name": null, 00:17:53.428 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:53.428 "is_configured": false, 00:17:53.428 "data_offset": 2048, 00:17:53.428 "data_size": 63488 00:17:53.428 }, 00:17:53.428 { 00:17:53.428 "name": null, 00:17:53.428 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:53.428 "is_configured": false, 00:17:53.428 "data_offset": 2048, 00:17:53.428 "data_size": 63488 00:17:53.428 } 00:17:53.428 ] 00:17:53.428 }' 00:17:53.428 04:18:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:53.428 04:18:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:53.996 04:18:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i++ )) 00:17:53.996 04:18:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:17:53.996 04:18:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@512 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:54.253 [2024-05-15 04:18:42.070034] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:54.253 [2024-05-15 04:18:42.070100] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:54.253 [2024-05-15 04:18:42.070130] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213f0c0 00:17:54.253 [2024-05-15 04:18:42.070146] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:54.253 [2024-05-15 04:18:42.070583] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:54.253 [2024-05-15 04:18:42.070610] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:54.254 [2024-05-15 04:18:42.070703] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:54.254 [2024-05-15 04:18:42.070735] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:54.254 pt3 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.254 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:54.511 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:54.511 "name": "raid_bdev1", 00:17:54.511 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:54.511 "strip_size_kb": 0, 00:17:54.511 "state": "configuring", 00:17:54.511 "raid_level": "raid1", 00:17:54.511 "superblock": true, 00:17:54.511 "num_base_bdevs": 4, 00:17:54.511 "num_base_bdevs_discovered": 2, 00:17:54.511 "num_base_bdevs_operational": 3, 00:17:54.511 "base_bdevs_list": [ 00:17:54.511 { 00:17:54.511 "name": null, 00:17:54.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.511 "is_configured": false, 00:17:54.511 "data_offset": 2048, 00:17:54.511 "data_size": 63488 00:17:54.511 }, 00:17:54.511 { 00:17:54.511 "name": "pt2", 00:17:54.511 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:54.511 "is_configured": true, 00:17:54.511 "data_offset": 2048, 00:17:54.511 "data_size": 63488 00:17:54.511 }, 00:17:54.511 { 00:17:54.511 "name": "pt3", 00:17:54.511 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:54.511 "is_configured": true, 00:17:54.511 "data_offset": 2048, 00:17:54.511 "data_size": 63488 00:17:54.511 }, 00:17:54.511 { 00:17:54.511 "name": null, 00:17:54.511 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:54.511 "is_configured": false, 00:17:54.511 "data_offset": 2048, 00:17:54.511 "data_size": 63488 00:17:54.511 } 00:17:54.511 ] 00:17:54.511 }' 00:17:54.511 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:54.511 04:18:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.075 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i++ )) 00:17:55.075 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:17:55.075 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # i=3 00:17:55.075 04:18:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:55.332 [2024-05-15 04:18:43.176955] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:55.332 [2024-05-15 04:18:43.177021] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:55.332 [2024-05-15 04:18:43.177045] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x213c510 00:17:55.332 [2024-05-15 04:18:43.177059] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:55.332 
[2024-05-15 04:18:43.177434] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:55.332 [2024-05-15 04:18:43.177456] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:55.332 [2024-05-15 04:18:43.177531] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:55.332 [2024-05-15 04:18:43.177555] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:55.332 [2024-05-15 04:18:43.177671] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x213b190 00:17:55.332 [2024-05-15 04:18:43.177685] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:55.332 [2024-05-15 04:18:43.177848] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2139fa0 00:17:55.332 [2024-05-15 04:18:43.177982] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x213b190 00:17:55.332 [2024-05-15 04:18:43.177996] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x213b190 00:17:55.332 [2024-05-15 04:18:43.178092] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:55.332 pt4 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.332 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:55.590 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:55.590 "name": "raid_bdev1", 00:17:55.590 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:55.590 "strip_size_kb": 0, 00:17:55.590 "state": "online", 00:17:55.590 "raid_level": "raid1", 00:17:55.590 "superblock": true, 00:17:55.590 "num_base_bdevs": 4, 00:17:55.590 "num_base_bdevs_discovered": 3, 00:17:55.590 "num_base_bdevs_operational": 3, 00:17:55.590 "base_bdevs_list": [ 00:17:55.590 { 00:17:55.590 "name": null, 00:17:55.590 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.590 "is_configured": false, 00:17:55.590 "data_offset": 2048, 00:17:55.590 "data_size": 63488 00:17:55.590 }, 00:17:55.590 { 00:17:55.590 "name": "pt2", 00:17:55.590 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:55.590 "is_configured": true, 00:17:55.590 "data_offset": 
2048, 00:17:55.590 "data_size": 63488 00:17:55.590 }, 00:17:55.590 { 00:17:55.590 "name": "pt3", 00:17:55.590 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:55.590 "is_configured": true, 00:17:55.590 "data_offset": 2048, 00:17:55.590 "data_size": 63488 00:17:55.590 }, 00:17:55.590 { 00:17:55.590 "name": "pt4", 00:17:55.590 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:55.590 "is_configured": true, 00:17:55.590 "data_offset": 2048, 00:17:55.590 "data_size": 63488 00:17:55.590 } 00:17:55.590 ] 00:17:55.590 }' 00:17:55.590 04:18:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:55.590 04:18:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.153 04:18:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:56.410 [2024-05-15 04:18:44.328002] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:56.410 [2024-05-15 04:18:44.328035] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:56.410 [2024-05-15 04:18:44.328115] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:56.410 [2024-05-15 04:18:44.328204] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:56.410 [2024-05-15 04:18:44.328221] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x213b190 name raid_bdev1, state offline 00:17:56.410 04:18:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.410 04:18:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # jq -r '.[]' 00:17:56.666 04:18:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # raid_bdev= 00:17:56.666 04:18:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@528 -- # '[' -n '' ']' 00:17:56.667 04:18:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@532 -- # '[' 4 -gt 2 ']' 00:17:56.667 04:18:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=3 00:17:56.667 04:18:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:17:56.923 04:18:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:57.181 [2024-05-15 04:18:45.170186] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:57.181 [2024-05-15 04:18:45.170240] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:57.181 [2024-05-15 04:18:45.170270] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2136f00 00:17:57.181 [2024-05-15 04:18:45.170287] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:57.181 [2024-05-15 04:18:45.172051] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:57.181 [2024-05-15 04:18:45.172081] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:57.181 [2024-05-15 04:18:45.172165] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt1 00:17:57.181 [2024-05-15 04:18:45.172208] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:57.181 [2024-05-15 04:18:45.172340] bdev_raid.c:3487:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:57.181 [2024-05-15 04:18:45.172360] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:57.181 [2024-05-15 04:18:45.172378] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x213dbf0 name raid_bdev1, state configuring 00:17:57.181 [2024-05-15 04:18:45.172408] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:57.181 [2024-05-15 04:18:45.172506] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:57.181 pt1 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # '[' 4 -gt 2 ']' 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.181 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:57.746 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:57.746 "name": "raid_bdev1", 00:17:57.746 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:57.746 "strip_size_kb": 0, 00:17:57.746 "state": "configuring", 00:17:57.746 "raid_level": "raid1", 00:17:57.746 "superblock": true, 00:17:57.746 "num_base_bdevs": 4, 00:17:57.746 "num_base_bdevs_discovered": 2, 00:17:57.746 "num_base_bdevs_operational": 3, 00:17:57.746 "base_bdevs_list": [ 00:17:57.746 { 00:17:57.746 "name": null, 00:17:57.746 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:57.747 "is_configured": false, 00:17:57.747 "data_offset": 2048, 00:17:57.747 "data_size": 63488 00:17:57.747 }, 00:17:57.747 { 00:17:57.747 "name": "pt2", 00:17:57.747 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:57.747 "is_configured": true, 00:17:57.747 "data_offset": 2048, 00:17:57.747 "data_size": 63488 00:17:57.747 }, 00:17:57.747 { 00:17:57.747 "name": "pt3", 00:17:57.747 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:57.747 "is_configured": true, 00:17:57.747 "data_offset": 2048, 00:17:57.747 "data_size": 63488 00:17:57.747 }, 00:17:57.747 { 00:17:57.747 
"name": null, 00:17:57.747 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:57.747 "is_configured": false, 00:17:57.747 "data_offset": 2048, 00:17:57.747 "data_size": 63488 00:17:57.747 } 00:17:57.747 ] 00:17:57.747 }' 00:17:57.747 04:18:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:57.747 04:18:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.312 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@546 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:58.312 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@546 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:58.312 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@546 -- # [[ false == \f\a\l\s\e ]] 00:17:58.312 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:58.584 [2024-05-15 04:18:46.573929] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:58.584 [2024-05-15 04:18:46.574003] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:58.584 [2024-05-15 04:18:46.574031] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21362e0 00:17:58.584 [2024-05-15 04:18:46.574046] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:58.584 [2024-05-15 04:18:46.574452] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:58.584 [2024-05-15 04:18:46.574478] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:58.584 [2024-05-15 04:18:46.574563] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:58.584 [2024-05-15 04:18:46.574594] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:58.584 [2024-05-15 04:18:46.574732] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x213c030 00:17:58.584 [2024-05-15 04:18:46.574748] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:58.584 [2024-05-15 04:18:46.574933] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x213b750 00:17:58.584 [2024-05-15 04:18:46.575094] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x213c030 00:17:58.584 [2024-05-15 04:18:46.575111] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x213c030 00:17:58.584 [2024-05-15 04:18:46.575225] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:58.584 pt4 00:17:58.893 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:58.893 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:17:58.893 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:17:58.894 
04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:17:58.894 "name": "raid_bdev1", 00:17:58.894 "uuid": "4726f6f2-39c4-4c84-8d93-c699caa50090", 00:17:58.894 "strip_size_kb": 0, 00:17:58.894 "state": "online", 00:17:58.894 "raid_level": "raid1", 00:17:58.894 "superblock": true, 00:17:58.894 "num_base_bdevs": 4, 00:17:58.894 "num_base_bdevs_discovered": 3, 00:17:58.894 "num_base_bdevs_operational": 3, 00:17:58.894 "base_bdevs_list": [ 00:17:58.894 { 00:17:58.894 "name": null, 00:17:58.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.894 "is_configured": false, 00:17:58.894 "data_offset": 2048, 00:17:58.894 "data_size": 63488 00:17:58.894 }, 00:17:58.894 { 00:17:58.894 "name": "pt2", 00:17:58.894 "uuid": "b0897720-3e79-5eae-aa46-62f78ed92740", 00:17:58.894 "is_configured": true, 00:17:58.894 "data_offset": 2048, 00:17:58.894 "data_size": 63488 00:17:58.894 }, 00:17:58.894 { 00:17:58.894 "name": "pt3", 00:17:58.894 "uuid": "cb9f3478-2464-55c3-8e94-1f0b9dfbeeb0", 00:17:58.894 "is_configured": true, 00:17:58.894 "data_offset": 2048, 00:17:58.894 "data_size": 63488 00:17:58.894 }, 00:17:58.894 { 00:17:58.894 "name": "pt4", 00:17:58.894 "uuid": "5a079308-2db7-5bea-9c7d-4fde0b5606e5", 00:17:58.894 "is_configured": true, 00:17:58.894 "data_offset": 2048, 00:17:58.894 "data_size": 63488 00:17:58.894 } 00:17:58.894 ] 00:17:58.894 }' 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:17:58.894 04:18:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.458 04:18:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:59.458 04:18:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:59.716 04:18:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # [[ false == \f\a\l\s\e ]] 00:17:59.716 04:18:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@558 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:59.716 04:18:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@558 -- # jq -r '.[] | .uuid' 00:17:59.973 [2024-05-15 04:18:47.941738] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@558 -- # '[' 4726f6f2-39c4-4c84-8d93-c699caa50090 '!=' 4726f6f2-39c4-4c84-8d93-c699caa50090 ']' 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@563 -- # killprocess 3901856 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@946 -- 
# '[' -z 3901856 ']' 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # kill -0 3901856 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # uname 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3901856 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3901856' 00:17:59.973 killing process with pid 3901856 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@965 -- # kill 3901856 00:17:59.973 [2024-05-15 04:18:47.985079] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:59.973 04:18:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@970 -- # wait 3901856 00:17:59.973 [2024-05-15 04:18:47.985166] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:59.973 [2024-05-15 04:18:47.985255] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:59.973 [2024-05-15 04:18:47.985268] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x213c030 name raid_bdev1, state offline 00:18:00.231 [2024-05-15 04:18:48.026469] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:00.489 04:18:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@565 -- # return 0 00:18:00.489 00:18:00.489 real 0m25.043s 00:18:00.489 user 0m47.046s 00:18:00.489 sys 0m3.407s 00:18:00.489 04:18:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:18:00.489 04:18:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.489 ************************************ 00:18:00.489 END TEST raid_superblock_test 00:18:00.489 ************************************ 00:18:00.489 04:18:48 bdev_raid -- bdev/bdev_raid.sh@809 -- # '[' true = true ']' 00:18:00.489 04:18:48 bdev_raid -- bdev/bdev_raid.sh@810 -- # for n in 2 4 00:18:00.489 04:18:48 bdev_raid -- bdev/bdev_raid.sh@811 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:18:00.489 04:18:48 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:18:00.489 04:18:48 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:18:00.489 04:18:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:00.489 ************************************ 00:18:00.489 START TEST raid_rebuild_test 00:18:00.489 ************************************ 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 false false true 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=2 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local superblock=false 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local background_io=false 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local 
verify=true 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local strip_size 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local create_arg 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # local data_offset 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@592 -- # '[' false = true ']' 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # raid_pid=3905300 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@598 -- # waitforlisten 3905300 /var/tmp/spdk-raid.sock 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@827 -- # '[' -z 3905300 ']' 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:00.489 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:18:00.489 04:18:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.489 [2024-05-15 04:18:48.400018] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
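The harness above launches a standalone bdevperf as the RPC target for the rebuild test and waits for it before issuing any bdev_* RPCs. A minimal sketch of that launch sequence, using the binary path, flags, and socket exactly as they appear in the trace; the polling loop stands in for SPDK's waitforlisten helper and is an assumption, not its real implementation:

# Sketch: start bdevperf for the rebuild test and wait for its RPC socket.
spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
sock=/var/tmp/spdk-raid.sock

# Flags as in the trace: 60 s randrw at 50% reads, 3 MiB I/Os, queue depth 2,
# start suspended (-z) and enable bdev_raid debug logging.
"$spdk"/build/examples/bdevperf -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
  -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!

# Stand-in for waitforlisten: poll until the RPC server answers.
until "$spdk"/scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1; do
  kill -0 "$raid_pid" || exit 1   # bail out if bdevperf died during startup
  sleep 0.5
done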
00:18:00.489 [2024-05-15 04:18:48.400093] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3905300 ] 00:18:00.489 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:00.489 Zero copy mechanism will not be used. 00:18:00.489 [2024-05-15 04:18:48.476487] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:00.748 [2024-05-15 04:18:48.587023] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:00.748 [2024-05-15 04:18:48.665657] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:00.748 [2024-05-15 04:18:48.665700] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:01.682 04:18:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:18:01.682 04:18:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # return 0 00:18:01.682 04:18:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:18:01.682 04:18:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:01.682 BaseBdev1_malloc 00:18:01.682 04:18:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:01.940 [2024-05-15 04:18:49.956086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:01.940 [2024-05-15 04:18:49.956153] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:01.940 [2024-05-15 04:18:49.956188] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x188c000 00:18:01.940 [2024-05-15 04:18:49.956205] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:02.198 [2024-05-15 04:18:49.958104] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:02.198 [2024-05-15 04:18:49.958134] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:02.198 BaseBdev1 00:18:02.198 04:18:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:18:02.198 04:18:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:02.456 BaseBdev2_malloc 00:18:02.456 04:18:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:02.714 [2024-05-15 04:18:50.529486] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:02.714 [2024-05-15 04:18:50.529554] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:02.714 [2024-05-15 04:18:50.529584] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a372c0 00:18:02.714 [2024-05-15 04:18:50.529600] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:02.715 [2024-05-15 04:18:50.531387] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:18:02.715 [2024-05-15 04:18:50.531417] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:02.715 BaseBdev2 00:18:02.715 04:18:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:18:02.972 spare_malloc 00:18:02.972 04:18:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:03.231 spare_delay 00:18:03.231 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@609 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:03.489 [2024-05-15 04:18:51.315337] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:03.489 [2024-05-15 04:18:51.315401] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:03.489 [2024-05-15 04:18:51.315424] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a3b100 00:18:03.489 [2024-05-15 04:18:51.315437] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:03.489 [2024-05-15 04:18:51.316966] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:03.489 [2024-05-15 04:18:51.316991] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:03.489 spare 00:18:03.489 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:03.747 [2024-05-15 04:18:51.600136] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:03.747 [2024-05-15 04:18:51.601614] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:03.747 [2024-05-15 04:18:51.601723] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a3b930 00:18:03.747 [2024-05-15 04:18:51.601741] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:03.747 [2024-05-15 04:18:51.601998] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a37130 00:18:03.747 [2024-05-15 04:18:51.602190] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a3b930 00:18:03.747 [2024-05-15 04:18:51.602206] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a3b930 00:18:03.747 [2024-05-15 04:18:51.602377] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:03.747 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 
00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.748 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:04.006 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:04.006 "name": "raid_bdev1", 00:18:04.006 "uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:04.006 "strip_size_kb": 0, 00:18:04.006 "state": "online", 00:18:04.006 "raid_level": "raid1", 00:18:04.006 "superblock": false, 00:18:04.006 "num_base_bdevs": 2, 00:18:04.006 "num_base_bdevs_discovered": 2, 00:18:04.006 "num_base_bdevs_operational": 2, 00:18:04.006 "base_bdevs_list": [ 00:18:04.006 { 00:18:04.006 "name": "BaseBdev1", 00:18:04.006 "uuid": "9980e10c-d4aa-5122-92bc-f07f3eeebc9b", 00:18:04.006 "is_configured": true, 00:18:04.006 "data_offset": 0, 00:18:04.006 "data_size": 65536 00:18:04.006 }, 00:18:04.006 { 00:18:04.006 "name": "BaseBdev2", 00:18:04.006 "uuid": "7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:04.006 "is_configured": true, 00:18:04.006 "data_offset": 0, 00:18:04.006 "data_size": 65536 00:18:04.006 } 00:18:04.006 ] 00:18:04.006 }' 00:18:04.006 04:18:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:04.006 04:18:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.572 04:18:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:04.572 04:18:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:18:04.830 [2024-05-15 04:18:52.715261] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:04.830 04:18:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # raid_bdev_size=65536 00:18:04.830 04:18:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.830 04:18:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@619 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@619 -- # data_offset=0 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # '[' false = true ']' 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # '[' true = true ']' 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@625 -- # local write_unit_size 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- 
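By this point the rebuild test has assembled a malloc -> passthru -> raid1 stack, plus a delayed "spare" held back for the later rebuild step. A short sketch of that construction, using only RPC invocations that appear verbatim in the trace above (the $rpc shorthand is the sketch's own convenience):

# Sketch: build the raid1 under test from malloc-backed passthru bdevs.
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Two data paths: 32 MiB malloc backing devices wrapped in passthru bdevs.
$rpc bdev_malloc_create 32 512 -b BaseBdev1_malloc
$rpc bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
$rpc bdev_malloc_create 32 512 -b BaseBdev2_malloc
$rpc bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2

# A spare kept behind a delay bdev so the test can pace the rebuild phase.
$rpc bdev_malloc_create 32 512 -b spare_malloc
$rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
$rpc bdev_passthru_create -b spare_delay -p spare

# Assemble the RAID1 under test (this variant runs without a superblock).
$rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1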
bdev/nbd_common.sh@10 -- # local bdev_list 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:05.089 04:18:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:18:05.348 [2024-05-15 04:18:53.260549] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a3d530 00:18:05.348 /dev/nbd0 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:05.348 1+0 records in 00:18:05.348 1+0 records out 00:18:05.348 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000170303 s, 24.1 MB/s 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@629 -- # '[' raid1 = raid5f ']' 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@633 -- # write_unit_size=1 00:18:05.348 04:18:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:18:10.609 65536+0 records in 00:18:10.609 65536+0 records out 00:18:10.609 33554432 bytes (34 MB, 32 MiB) copied, 4.83797 s, 6.9 MB/s 00:18:10.610 04:18:58 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@636 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:10.610 [2024-05-15 04:18:58.435734] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:10.610 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:18:10.868 [2024-05-15 04:18:58.706031] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.868 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:11.126 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:11.126 "name": "raid_bdev1", 00:18:11.126 
"uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:11.126 "strip_size_kb": 0, 00:18:11.126 "state": "online", 00:18:11.126 "raid_level": "raid1", 00:18:11.126 "superblock": false, 00:18:11.126 "num_base_bdevs": 2, 00:18:11.126 "num_base_bdevs_discovered": 1, 00:18:11.126 "num_base_bdevs_operational": 1, 00:18:11.126 "base_bdevs_list": [ 00:18:11.126 { 00:18:11.126 "name": null, 00:18:11.126 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:11.126 "is_configured": false, 00:18:11.126 "data_offset": 0, 00:18:11.126 "data_size": 65536 00:18:11.126 }, 00:18:11.126 { 00:18:11.126 "name": "BaseBdev2", 00:18:11.126 "uuid": "7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:11.126 "is_configured": true, 00:18:11.126 "data_offset": 0, 00:18:11.126 "data_size": 65536 00:18:11.126 } 00:18:11.126 ] 00:18:11.126 }' 00:18:11.126 04:18:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:11.126 04:18:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:11.691 04:18:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:11.949 [2024-05-15 04:18:59.760862] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:11.949 [2024-05-15 04:18:59.767097] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a3da10 00:18:11.949 [2024-05-15 04:18:59.769158] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:11.949 04:18:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@647 -- # sleep 1 00:18:12.882 04:19:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:12.882 04:19:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:12.882 04:19:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:12.882 04:19:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:12.882 04:19:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:12.882 04:19:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.882 04:19:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:13.139 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:13.139 "name": "raid_bdev1", 00:18:13.139 "uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:13.139 "strip_size_kb": 0, 00:18:13.139 "state": "online", 00:18:13.139 "raid_level": "raid1", 00:18:13.139 "superblock": false, 00:18:13.139 "num_base_bdevs": 2, 00:18:13.139 "num_base_bdevs_discovered": 2, 00:18:13.139 "num_base_bdevs_operational": 2, 00:18:13.139 "process": { 00:18:13.139 "type": "rebuild", 00:18:13.140 "target": "spare", 00:18:13.140 "progress": { 00:18:13.140 "blocks": 24576, 00:18:13.140 "percent": 37 00:18:13.140 } 00:18:13.140 }, 00:18:13.140 "base_bdevs_list": [ 00:18:13.140 { 00:18:13.140 "name": "spare", 00:18:13.140 "uuid": "f6dcc76a-0eb8-527d-9f92-729c777fc79b", 00:18:13.140 "is_configured": true, 00:18:13.140 "data_offset": 0, 00:18:13.140 "data_size": 65536 00:18:13.140 }, 00:18:13.140 { 00:18:13.140 "name": "BaseBdev2", 00:18:13.140 "uuid": 
"7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:13.140 "is_configured": true, 00:18:13.140 "data_offset": 0, 00:18:13.140 "data_size": 65536 00:18:13.140 } 00:18:13.140 ] 00:18:13.140 }' 00:18:13.140 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:13.140 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:13.140 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:13.140 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:18:13.140 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:13.398 [2024-05-15 04:19:01.339276] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:13.398 [2024-05-15 04:19:01.382392] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:13.398 [2024-05-15 04:19:01.382449] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:13.398 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.655 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:13.655 "name": "raid_bdev1", 00:18:13.655 "uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:13.655 "strip_size_kb": 0, 00:18:13.655 "state": "online", 00:18:13.655 "raid_level": "raid1", 00:18:13.655 "superblock": false, 00:18:13.655 "num_base_bdevs": 2, 00:18:13.655 "num_base_bdevs_discovered": 1, 00:18:13.655 "num_base_bdevs_operational": 1, 00:18:13.655 "base_bdevs_list": [ 00:18:13.655 { 00:18:13.655 "name": null, 00:18:13.655 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.655 "is_configured": false, 00:18:13.655 "data_offset": 0, 00:18:13.655 "data_size": 65536 00:18:13.655 }, 00:18:13.655 { 00:18:13.655 "name": "BaseBdev2", 00:18:13.655 "uuid": "7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:13.655 "is_configured": true, 00:18:13.655 "data_offset": 0, 00:18:13.655 "data_size": 65536 00:18:13.655 } 00:18:13.656 ] 00:18:13.656 
}' 00:18:13.656 04:19:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:13.656 04:19:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:14.589 "name": "raid_bdev1", 00:18:14.589 "uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:14.589 "strip_size_kb": 0, 00:18:14.589 "state": "online", 00:18:14.589 "raid_level": "raid1", 00:18:14.589 "superblock": false, 00:18:14.589 "num_base_bdevs": 2, 00:18:14.589 "num_base_bdevs_discovered": 1, 00:18:14.589 "num_base_bdevs_operational": 1, 00:18:14.589 "base_bdevs_list": [ 00:18:14.589 { 00:18:14.589 "name": null, 00:18:14.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:14.589 "is_configured": false, 00:18:14.589 "data_offset": 0, 00:18:14.589 "data_size": 65536 00:18:14.589 }, 00:18:14.589 { 00:18:14.589 "name": "BaseBdev2", 00:18:14.589 "uuid": "7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:14.589 "is_configured": true, 00:18:14.589 "data_offset": 0, 00:18:14.589 "data_size": 65536 00:18:14.589 } 00:18:14.589 ] 00:18:14.589 }' 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:18:14.589 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:14.847 [2024-05-15 04:19:02.787574] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:14.847 [2024-05-15 04:19:02.793315] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a308e0 00:18:14.847 [2024-05-15 04:19:02.794684] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:14.847 04:19:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # sleep 1 00:18:16.220 04:19:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:16.220 04:19:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:16.220 04:19:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:16.220 04:19:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # 
local target=spare 00:18:16.220 04:19:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:16.220 04:19:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.220 04:19:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:16.220 "name": "raid_bdev1", 00:18:16.220 "uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:16.220 "strip_size_kb": 0, 00:18:16.220 "state": "online", 00:18:16.220 "raid_level": "raid1", 00:18:16.220 "superblock": false, 00:18:16.220 "num_base_bdevs": 2, 00:18:16.220 "num_base_bdevs_discovered": 2, 00:18:16.220 "num_base_bdevs_operational": 2, 00:18:16.220 "process": { 00:18:16.220 "type": "rebuild", 00:18:16.220 "target": "spare", 00:18:16.220 "progress": { 00:18:16.220 "blocks": 24576, 00:18:16.220 "percent": 37 00:18:16.220 } 00:18:16.220 }, 00:18:16.220 "base_bdevs_list": [ 00:18:16.220 { 00:18:16.220 "name": "spare", 00:18:16.220 "uuid": "f6dcc76a-0eb8-527d-9f92-729c777fc79b", 00:18:16.220 "is_configured": true, 00:18:16.220 "data_offset": 0, 00:18:16.220 "data_size": 65536 00:18:16.220 }, 00:18:16.220 { 00:18:16.220 "name": "BaseBdev2", 00:18:16.220 "uuid": "7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:16.220 "is_configured": true, 00:18:16.220 "data_offset": 0, 00:18:16.220 "data_size": 65536 00:18:16.220 } 00:18:16.220 ] 00:18:16.220 }' 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@666 -- # '[' false = true ']' 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=2 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@693 -- # '[' 2 -gt 2 ']' 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # local timeout=620 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.220 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:16.478 
04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:16.478 "name": "raid_bdev1", 00:18:16.478 "uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:16.478 "strip_size_kb": 0, 00:18:16.478 "state": "online", 00:18:16.478 "raid_level": "raid1", 00:18:16.478 "superblock": false, 00:18:16.478 "num_base_bdevs": 2, 00:18:16.478 "num_base_bdevs_discovered": 2, 00:18:16.478 "num_base_bdevs_operational": 2, 00:18:16.478 "process": { 00:18:16.478 "type": "rebuild", 00:18:16.478 "target": "spare", 00:18:16.478 "progress": { 00:18:16.478 "blocks": 30720, 00:18:16.478 "percent": 46 00:18:16.478 } 00:18:16.478 }, 00:18:16.478 "base_bdevs_list": [ 00:18:16.478 { 00:18:16.478 "name": "spare", 00:18:16.478 "uuid": "f6dcc76a-0eb8-527d-9f92-729c777fc79b", 00:18:16.478 "is_configured": true, 00:18:16.478 "data_offset": 0, 00:18:16.478 "data_size": 65536 00:18:16.478 }, 00:18:16.478 { 00:18:16.478 "name": "BaseBdev2", 00:18:16.478 "uuid": "7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:16.478 "is_configured": true, 00:18:16.478 "data_offset": 0, 00:18:16.478 "data_size": 65536 00:18:16.478 } 00:18:16.478 ] 00:18:16.478 }' 00:18:16.478 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:16.478 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:16.478 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:16.478 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:18:16.478 04:19:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@711 -- # sleep 1 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:17.866 "name": "raid_bdev1", 00:18:17.866 "uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:17.866 "strip_size_kb": 0, 00:18:17.866 "state": "online", 00:18:17.866 "raid_level": "raid1", 00:18:17.866 "superblock": false, 00:18:17.866 "num_base_bdevs": 2, 00:18:17.866 "num_base_bdevs_discovered": 2, 00:18:17.866 "num_base_bdevs_operational": 2, 00:18:17.866 "process": { 00:18:17.866 "type": "rebuild", 00:18:17.866 "target": "spare", 00:18:17.866 "progress": { 00:18:17.866 "blocks": 57344, 00:18:17.866 "percent": 87 00:18:17.866 } 00:18:17.866 }, 00:18:17.866 "base_bdevs_list": [ 00:18:17.866 { 00:18:17.866 "name": "spare", 00:18:17.866 "uuid": "f6dcc76a-0eb8-527d-9f92-729c777fc79b", 00:18:17.866 "is_configured": true, 00:18:17.866 "data_offset": 0, 00:18:17.866 "data_size": 
65536 00:18:17.866 }, 00:18:17.866 { 00:18:17.866 "name": "BaseBdev2", 00:18:17.866 "uuid": "7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:17.866 "is_configured": true, 00:18:17.866 "data_offset": 0, 00:18:17.866 "data_size": 65536 00:18:17.866 } 00:18:17.866 ] 00:18:17.866 }' 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:18:17.866 04:19:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@711 -- # sleep 1 00:18:18.124 [2024-05-15 04:19:06.021177] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:18:18.124 [2024-05-15 04:19:06.021242] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:18:18.124 [2024-05-15 04:19:06.021295] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:19.058 04:19:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:18:19.058 04:19:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:19.058 04:19:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:19.058 04:19:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:19.058 04:19:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:19.058 04:19:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:19.058 04:19:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.058 04:19:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:19.058 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:19.058 "name": "raid_bdev1", 00:18:19.058 "uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:19.058 "strip_size_kb": 0, 00:18:19.058 "state": "online", 00:18:19.058 "raid_level": "raid1", 00:18:19.058 "superblock": false, 00:18:19.058 "num_base_bdevs": 2, 00:18:19.058 "num_base_bdevs_discovered": 2, 00:18:19.058 "num_base_bdevs_operational": 2, 00:18:19.058 "base_bdevs_list": [ 00:18:19.058 { 00:18:19.058 "name": "spare", 00:18:19.058 "uuid": "f6dcc76a-0eb8-527d-9f92-729c777fc79b", 00:18:19.058 "is_configured": true, 00:18:19.058 "data_offset": 0, 00:18:19.058 "data_size": 65536 00:18:19.058 }, 00:18:19.058 { 00:18:19.058 "name": "BaseBdev2", 00:18:19.058 "uuid": "7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:19.058 "is_configured": true, 00:18:19.058 "data_offset": 0, 00:18:19.058 "data_size": 65536 00:18:19.058 } 00:18:19.058 ] 00:18:19.058 }' 00:18:19.058 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:19.058 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:18:19.058 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:19.317 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == 
\s\p\a\r\e ]] 00:18:19.317 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@709 -- # break 00:18:19.317 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:19.317 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:19.317 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:18:19.317 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:18:19.317 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:19.317 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.317 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:19.575 "name": "raid_bdev1", 00:18:19.575 "uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:19.575 "strip_size_kb": 0, 00:18:19.575 "state": "online", 00:18:19.575 "raid_level": "raid1", 00:18:19.575 "superblock": false, 00:18:19.575 "num_base_bdevs": 2, 00:18:19.575 "num_base_bdevs_discovered": 2, 00:18:19.575 "num_base_bdevs_operational": 2, 00:18:19.575 "base_bdevs_list": [ 00:18:19.575 { 00:18:19.575 "name": "spare", 00:18:19.575 "uuid": "f6dcc76a-0eb8-527d-9f92-729c777fc79b", 00:18:19.575 "is_configured": true, 00:18:19.575 "data_offset": 0, 00:18:19.575 "data_size": 65536 00:18:19.575 }, 00:18:19.575 { 00:18:19.575 "name": "BaseBdev2", 00:18:19.575 "uuid": "7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:19.575 "is_configured": true, 00:18:19.575 "data_offset": 0, 00:18:19.575 "data_size": 65536 00:18:19.575 } 00:18:19.575 ] 00:18:19.575 }' 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.575 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:19.833 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:19.833 "name": "raid_bdev1", 00:18:19.833 "uuid": "c50f6f05-b2e8-4aad-9c5f-202a6f2843b1", 00:18:19.833 "strip_size_kb": 0, 00:18:19.833 "state": "online", 00:18:19.833 "raid_level": "raid1", 00:18:19.833 "superblock": false, 00:18:19.833 "num_base_bdevs": 2, 00:18:19.833 "num_base_bdevs_discovered": 2, 00:18:19.833 "num_base_bdevs_operational": 2, 00:18:19.833 "base_bdevs_list": [ 00:18:19.833 { 00:18:19.833 "name": "spare", 00:18:19.833 "uuid": "f6dcc76a-0eb8-527d-9f92-729c777fc79b", 00:18:19.833 "is_configured": true, 00:18:19.833 "data_offset": 0, 00:18:19.833 "data_size": 65536 00:18:19.833 }, 00:18:19.833 { 00:18:19.833 "name": "BaseBdev2", 00:18:19.833 "uuid": "7fc1a0f4-0cae-5668-b885-225b70c489a7", 00:18:19.833 "is_configured": true, 00:18:19.833 "data_offset": 0, 00:18:19.833 "data_size": 65536 00:18:19.833 } 00:18:19.833 ] 00:18:19.833 }' 00:18:19.833 04:19:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:19.833 04:19:07 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.398 04:19:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:20.656 [2024-05-15 04:19:08.473506] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:20.656 [2024-05-15 04:19:08.473541] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:20.656 [2024-05-15 04:19:08.473621] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:20.656 [2024-05-15 04:19:08.473689] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:20.656 [2024-05-15 04:19:08.473705] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a3b930 name raid_bdev1, state offline 00:18:20.656 04:19:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.656 04:19:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@720 -- # jq length 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # '[' false = true ']' 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:20.914 
04:19:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:20.914 04:19:08 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:18:21.171 /dev/nbd0 00:18:21.171 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:21.171 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:21.171 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:18:21.171 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:18:21.171 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:21.172 1+0 records in 00:18:21.172 1+0 records out 00:18:21.172 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000152626 s, 26.8 MB/s 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:21.172 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:18:21.430 /dev/nbd1 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:21.430 1+0 records in 00:18:21.430 1+0 records out 00:18:21.430 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000248829 s, 16.5 MB/s 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:21.430 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:21.431 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:21.688 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:21.688 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:21.688 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:21.688 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:21.689 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:21.689 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:21.689 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:21.689 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:21.689 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:21.689 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 
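Spread across the trace above is the actual rebuild exercise: the array is degraded, the spare is hot-added, the test polls the raid bdev's process field until the rebuild finishes, and finally the surviving base bdev is compared against the rebuilt spare over NBD. The sketch below condenses that flow and is not the script itself: the add/remove/re-add churn at @646/@653/@662 is collapsed into a single add, the timeout guard of the @706-@711 loop is dropped, and the script's two-step jq is collapsed into one filter.

```bash
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Degrade the array, then hot-add the spare to start a rebuild (bdev_raid.sh@640, @662).
$rpc bdev_raid_remove_base_bdev BaseBdev1
$rpc bdev_raid_add_base_bdev raid_bdev1 spare

# Poll until the rebuild process disappears from the raid bdev info,
# roughly what verify_raid_bdev_process does with its '.process.type // "none"' check.
while [ "$($rpc bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"')" = "rebuild" ]; do
  sleep 1
done

# Tear the raid bdev down, then compare the surviving copy with the rebuilt spare
# byte for byte over NBD (bdev_raid.sh@719, @737-@739).
$rpc bdev_raid_delete raid_bdev1
$rpc nbd_start_disk BaseBdev1 /dev/nbd0
$rpc nbd_start_disk spare /dev/nbd1
cmp -i 0 /dev/nbd0 /dev/nbd1    # -i 0: data_offset is 0 in this no-superblock run
$rpc nbd_stop_disk /dev/nbd0
$rpc nbd_stop_disk /dev/nbd1
```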
00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@743 -- # '[' false = true ']' 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@783 -- # killprocess 3905300 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@946 -- # '[' -z 3905300 ']' 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # kill -0 3905300 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # uname 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3905300 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3905300' 00:18:21.947 killing process with pid 3905300 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@965 -- # kill 3905300 00:18:21.947 Received shutdown signal, test time was about 60.000000 seconds 00:18:21.947 00:18:21.947 Latency(us) 00:18:21.947 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:21.947 =================================================================================================================== 00:18:21.947 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:18:21.947 [2024-05-15 04:19:09.938425] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:21.947 04:19:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@970 -- # wait 3905300 00:18:22.204 [2024-05-15 04:19:09.974515] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@785 -- # return 0 00:18:22.469 00:18:22.469 real 0m21.914s 00:18:22.469 user 0m30.686s 00:18:22.469 sys 0m4.184s 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:18:22.469 ************************************ 00:18:22.469 END TEST raid_rebuild_test 00:18:22.469 ************************************ 00:18:22.469 04:19:10 bdev_raid -- bdev/bdev_raid.sh@812 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:18:22.469 04:19:10 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:18:22.469 04:19:10 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 
00:18:22.469 04:19:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:22.469 ************************************ 00:18:22.469 START TEST raid_rebuild_test_sb 00:18:22.469 ************************************ 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false true 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=2 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local superblock=true 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local background_io=false 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local verify=true 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local strip_size 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local create_arg 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # local data_offset 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # '[' true = true ']' 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@593 -- # create_arg+=' -s' 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # raid_pid=3908059 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@598 -- # waitforlisten 3908059 /var/tmp/spdk-raid.sock 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3908059 ']' 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 
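The superblock variant that starts here re-enters the same raid_rebuild_test body with superblock=true, so create_arg picks up ' -s' (bdev_raid.sh@592-@593), and a fresh bdevperf instance (pid 3908059) is launched to host the bdevs. The sketch below paraphrases that startup using only the command line and helper visible in the trace; the backgrounding and pid capture are glue added for the sketch, not copied from the script.

```bash
# Start bdevperf with the arguments traced at bdev_raid.sh@596 and drive it over
# the RPC socket passed with -r.
bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
$bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
          -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!

# Block until the app is listening on the socket before issuing any bdev RPCs
# (autotest_common.sh helper, seen as "waitforlisten 3908059 /var/tmp/spdk-raid.sock" in the trace).
waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock
```

The "I/O size of 3145728" notice a little further down corresponds to the -o 3M argument (3 x 1024 x 1024 bytes).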
00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:22.469 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:18:22.469 04:19:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:22.469 [2024-05-15 04:19:10.375424] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:18:22.469 [2024-05-15 04:19:10.375511] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3908059 ] 00:18:22.469 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:22.469 Zero copy mechanism will not be used. 00:18:22.469 [2024-05-15 04:19:10.461458] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:22.788 [2024-05-15 04:19:10.587367] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:22.788 [2024-05-15 04:19:10.659267] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:22.788 [2024-05-15 04:19:10.659315] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:23.376 04:19:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:18:23.376 04:19:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # return 0 00:18:23.376 04:19:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:18:23.376 04:19:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:23.634 BaseBdev1_malloc 00:18:23.634 04:19:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:23.891 [2024-05-15 04:19:11.836218] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:23.891 [2024-05-15 04:19:11.836294] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:23.891 [2024-05-15 04:19:11.836331] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x117c000 00:18:23.891 [2024-05-15 04:19:11.836348] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:23.891 [2024-05-15 04:19:11.838301] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:23.891 [2024-05-15 04:19:11.838332] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:23.891 BaseBdev1 00:18:23.891 04:19:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:18:23.891 04:19:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:24.148 BaseBdev2_malloc 00:18:24.148 04:19:12 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:24.406 [2024-05-15 04:19:12.316665] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:24.406 [2024-05-15 04:19:12.316732] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:24.406 [2024-05-15 04:19:12.316756] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13272c0 00:18:24.406 [2024-05-15 04:19:12.316772] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:24.406 [2024-05-15 04:19:12.318152] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:24.406 [2024-05-15 04:19:12.318181] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:24.406 BaseBdev2 00:18:24.406 04:19:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:18:24.664 spare_malloc 00:18:24.664 04:19:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:24.922 spare_delay 00:18:24.922 04:19:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@609 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:25.180 [2024-05-15 04:19:13.065855] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:25.180 [2024-05-15 04:19:13.065935] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:25.180 [2024-05-15 04:19:13.065961] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x132b100 00:18:25.180 [2024-05-15 04:19:13.065977] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:25.180 [2024-05-15 04:19:13.067631] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:25.180 [2024-05-15 04:19:13.067662] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:25.180 spare 00:18:25.180 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:18:25.437 [2024-05-15 04:19:13.302528] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:25.437 [2024-05-15 04:19:13.303932] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:25.437 [2024-05-15 04:19:13.304143] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x132b930 00:18:25.437 [2024-05-15 04:19:13.304161] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:25.437 [2024-05-15 04:19:13.304401] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x117b0f0 00:18:25.437 [2024-05-15 04:19:13.304588] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x132b930 00:18:25.437 [2024-05-15 04:19:13.304614] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x132b930 
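Setup for the superblock run mirrors the earlier one; the only difference visible in the trace is the -s flag on creation, and with it the raid bdev comes up with blockcnt 63488 instead of 65536. The data_offset of 2048 blocks read out a little further down accounts for the difference (65536 - 63488 = 2048), i.e. the start of each base bdev is set aside, presumably for the on-disk superblock. A minimal sketch of just that step, reusing the rpc shorthand from the earlier sketches:

```bash
rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Same assembly as the non-superblock run, plus -s (bdev_raid.sh@612).
$rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1

# The @619 check reads the base bdev data_offset out of the raid info;
# with -s it reports 2048 (blocks), versus 0 in the first run.
$rpc bdev_raid_get_bdevs all | jq -r '.[].base_bdevs_list[0].data_offset'
```

The dd fill further down accordingly writes count=63488 blocks instead of 65536.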
00:18:25.438 [2024-05-15 04:19:13.304759] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.438 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:25.695 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:25.695 "name": "raid_bdev1", 00:18:25.695 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:25.695 "strip_size_kb": 0, 00:18:25.695 "state": "online", 00:18:25.695 "raid_level": "raid1", 00:18:25.695 "superblock": true, 00:18:25.695 "num_base_bdevs": 2, 00:18:25.695 "num_base_bdevs_discovered": 2, 00:18:25.695 "num_base_bdevs_operational": 2, 00:18:25.695 "base_bdevs_list": [ 00:18:25.695 { 00:18:25.695 "name": "BaseBdev1", 00:18:25.695 "uuid": "2406264d-e827-51e9-a24e-b41a00b85ee3", 00:18:25.695 "is_configured": true, 00:18:25.695 "data_offset": 2048, 00:18:25.695 "data_size": 63488 00:18:25.695 }, 00:18:25.695 { 00:18:25.695 "name": "BaseBdev2", 00:18:25.695 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:25.695 "is_configured": true, 00:18:25.695 "data_offset": 2048, 00:18:25.695 "data_size": 63488 00:18:25.695 } 00:18:25.695 ] 00:18:25.695 }' 00:18:25.695 04:19:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:25.695 04:19:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:26.260 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:26.260 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:18:26.517 [2024-05-15 04:19:14.377544] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:26.517 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # raid_bdev_size=63488 00:18:26.517 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.517 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@619 -- # jq -r 
'.[].base_bdevs_list[0].data_offset' 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@619 -- # data_offset=2048 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # '[' false = true ']' 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # '[' true = true ']' 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@625 -- # local write_unit_size 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:26.774 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:18:27.031 [2024-05-15 04:19:14.918836] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x117b600 00:18:27.031 /dev/nbd0 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:27.031 1+0 records in 00:18:27.031 1+0 records out 00:18:27.031 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000177714 s, 23.0 MB/s 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@629 -- # '[' raid1 = raid5f ']' 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@633 -- # write_unit_size=1 00:18:27.031 04:19:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:18:32.287 63488+0 records in 00:18:32.287 63488+0 records out 00:18:32.287 32505856 bytes (33 MB, 31 MiB) copied, 4.5951 s, 7.1 MB/s 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@636 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:32.287 04:19:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:18:32.287 [2024-05-15 04:19:19.817386] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:32.287 [2024-05-15 04:19:20.054078] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:32.287 04:19:20 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.287 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:32.545 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:32.545 "name": "raid_bdev1", 00:18:32.545 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:32.545 "strip_size_kb": 0, 00:18:32.545 "state": "online", 00:18:32.545 "raid_level": "raid1", 00:18:32.545 "superblock": true, 00:18:32.545 "num_base_bdevs": 2, 00:18:32.545 "num_base_bdevs_discovered": 1, 00:18:32.545 "num_base_bdevs_operational": 1, 00:18:32.545 "base_bdevs_list": [ 00:18:32.545 { 00:18:32.545 "name": null, 00:18:32.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:32.545 "is_configured": false, 00:18:32.545 "data_offset": 2048, 00:18:32.545 "data_size": 63488 00:18:32.545 }, 00:18:32.545 { 00:18:32.545 "name": "BaseBdev2", 00:18:32.545 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:32.545 "is_configured": true, 00:18:32.545 "data_offset": 2048, 00:18:32.545 "data_size": 63488 00:18:32.545 } 00:18:32.545 ] 00:18:32.545 }' 00:18:32.545 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:32.545 04:19:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:33.127 04:19:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:33.385 [2024-05-15 04:19:21.193289] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:33.385 [2024-05-15 04:19:21.199892] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x132b390 00:18:33.385 [2024-05-15 04:19:21.201777] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:33.385 04:19:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@647 -- # sleep 1 00:18:34.320 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:34.320 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:34.320 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:34.320 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:34.320 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:34.320 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
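For reference, the verify_raid_bdev_state helper traced at bdev_raid.sh@117-129 works by dumping bdev_raid_get_bdevs all and narrowing the output to the bdev under test with the jq filter shown in this log; the assertions then run against fields of that JSON object. A hypothetical standalone equivalent is sketched below; the field names come from the JSON dumps in this trace, but the helper's exact comparison logic is not visible here, so treat this as an approximation rather than the upstream implementation.

check_raid_state() {
  # check_raid_state <name> <state> <raid_level> <num_operational>  (hypothetical helper)
  local name=$1 state=$2 level=$3 operational=$4 info
  info=$(scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
         jq -r ".[] | select(.name == \"$name\")")
  [ "$(jq -r .state <<<"$info")" = "$state" ] &&
    [ "$(jq -r .raid_level <<<"$info")" = "$level" ] &&
    [ "$(jq -r .num_base_bdevs_operational <<<"$info")" -eq "$operational" ]
}
check_raid_state raid_bdev1 online raid1 1   # e.g. the expectation checked at bdev_raid.sh@643 above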
00:18:34.320 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:34.578 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:34.578 "name": "raid_bdev1", 00:18:34.578 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:34.578 "strip_size_kb": 0, 00:18:34.578 "state": "online", 00:18:34.578 "raid_level": "raid1", 00:18:34.578 "superblock": true, 00:18:34.578 "num_base_bdevs": 2, 00:18:34.578 "num_base_bdevs_discovered": 2, 00:18:34.578 "num_base_bdevs_operational": 2, 00:18:34.578 "process": { 00:18:34.578 "type": "rebuild", 00:18:34.578 "target": "spare", 00:18:34.578 "progress": { 00:18:34.578 "blocks": 24576, 00:18:34.578 "percent": 38 00:18:34.578 } 00:18:34.578 }, 00:18:34.578 "base_bdevs_list": [ 00:18:34.578 { 00:18:34.578 "name": "spare", 00:18:34.578 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:34.578 "is_configured": true, 00:18:34.578 "data_offset": 2048, 00:18:34.578 "data_size": 63488 00:18:34.578 }, 00:18:34.578 { 00:18:34.578 "name": "BaseBdev2", 00:18:34.578 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:34.578 "is_configured": true, 00:18:34.578 "data_offset": 2048, 00:18:34.578 "data_size": 63488 00:18:34.578 } 00:18:34.578 ] 00:18:34.578 }' 00:18:34.578 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:34.578 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:34.578 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:34.578 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:18:34.578 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:34.836 [2024-05-15 04:19:22.771759] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:34.836 [2024-05-15 04:19:22.815052] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:34.836 [2024-05-15 04:19:22.815120] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:34.836 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:34.836 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:34.836 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:34.836 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:34.836 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:34.836 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:18:34.836 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:34.836 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:34.836 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:34.836 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:34.837 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:34.837 04:19:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:35.095 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:35.095 "name": "raid_bdev1", 00:18:35.095 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:35.095 "strip_size_kb": 0, 00:18:35.095 "state": "online", 00:18:35.095 "raid_level": "raid1", 00:18:35.095 "superblock": true, 00:18:35.095 "num_base_bdevs": 2, 00:18:35.095 "num_base_bdevs_discovered": 1, 00:18:35.095 "num_base_bdevs_operational": 1, 00:18:35.095 "base_bdevs_list": [ 00:18:35.095 { 00:18:35.095 "name": null, 00:18:35.095 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.096 "is_configured": false, 00:18:35.096 "data_offset": 2048, 00:18:35.096 "data_size": 63488 00:18:35.096 }, 00:18:35.096 { 00:18:35.096 "name": "BaseBdev2", 00:18:35.096 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:35.096 "is_configured": true, 00:18:35.096 "data_offset": 2048, 00:18:35.096 "data_size": 63488 00:18:35.096 } 00:18:35.096 ] 00:18:35.096 }' 00:18:35.096 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:35.096 04:19:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:35.664 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:35.664 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:35.664 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:18:35.664 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:18:35.664 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:35.664 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.664 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:35.925 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:35.925 "name": "raid_bdev1", 00:18:35.925 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:35.925 "strip_size_kb": 0, 00:18:35.925 "state": "online", 00:18:35.925 "raid_level": "raid1", 00:18:35.925 "superblock": true, 00:18:35.925 "num_base_bdevs": 2, 00:18:35.925 "num_base_bdevs_discovered": 1, 00:18:35.925 "num_base_bdevs_operational": 1, 00:18:35.925 "base_bdevs_list": [ 00:18:35.925 { 00:18:35.925 "name": null, 00:18:35.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.925 "is_configured": false, 00:18:35.925 "data_offset": 2048, 00:18:35.925 "data_size": 63488 00:18:35.925 }, 00:18:35.925 { 00:18:35.925 "name": "BaseBdev2", 00:18:35.925 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:35.925 "is_configured": true, 00:18:35.925 "data_offset": 2048, 00:18:35.925 "data_size": 63488 00:18:35.925 } 00:18:35.925 ] 00:18:35.925 }' 00:18:35.925 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:35.925 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:35.925 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 
-- # jq -r '.process.target // "none"' 00:18:36.183 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:18:36.183 04:19:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:36.184 [2024-05-15 04:19:24.176609] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:36.184 [2024-05-15 04:19:24.183593] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x117b920 00:18:36.184 [2024-05-15 04:19:24.185016] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:36.184 04:19:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # sleep 1 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:37.558 "name": "raid_bdev1", 00:18:37.558 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:37.558 "strip_size_kb": 0, 00:18:37.558 "state": "online", 00:18:37.558 "raid_level": "raid1", 00:18:37.558 "superblock": true, 00:18:37.558 "num_base_bdevs": 2, 00:18:37.558 "num_base_bdevs_discovered": 2, 00:18:37.558 "num_base_bdevs_operational": 2, 00:18:37.558 "process": { 00:18:37.558 "type": "rebuild", 00:18:37.558 "target": "spare", 00:18:37.558 "progress": { 00:18:37.558 "blocks": 24576, 00:18:37.558 "percent": 38 00:18:37.558 } 00:18:37.558 }, 00:18:37.558 "base_bdevs_list": [ 00:18:37.558 { 00:18:37.558 "name": "spare", 00:18:37.558 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:37.558 "is_configured": true, 00:18:37.558 "data_offset": 2048, 00:18:37.558 "data_size": 63488 00:18:37.558 }, 00:18:37.558 { 00:18:37.558 "name": "BaseBdev2", 00:18:37.558 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:37.558 "is_configured": true, 00:18:37.558 "data_offset": 2048, 00:18:37.558 "data_size": 63488 00:18:37.558 } 00:18:37.558 ] 00:18:37.558 }' 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@666 -- # '[' true = true ']' 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@666 -- # '[' = false ']' 
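The next line in the trace is a plain bash diagnostic rather than SPDK output: at bdev_raid.sh line 666 the left-hand operand of the '[' test expanded to an empty string (the xtrace just above shows the command as '[' = false ']'), so the shell reports a missing unary operator and the test simply returns non-zero. Which variable was empty is not visible in this log, so the snippet below only illustrates the failure pattern and the two usual ways to make such a test robust against empty values.

flag=""                            # stands in for whatever variable was empty at line 666
# [ $flag = false ] && echo skip   # unquoted: expands to `[ = false ]` -> "unary operator expected"
[ "$flag" = false ] && echo skip   # quoting keeps the empty operand in place as an argument
[[ $flag = false ]] && echo skip   # [[ ]] does not word-split, so quoting is unnecessary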
00:18:37.558 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 666: [: =: unary operator expected 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=2 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@693 -- # '[' 2 -gt 2 ']' 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # local timeout=641 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.558 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:37.816 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:37.816 "name": "raid_bdev1", 00:18:37.816 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:37.816 "strip_size_kb": 0, 00:18:37.816 "state": "online", 00:18:37.816 "raid_level": "raid1", 00:18:37.816 "superblock": true, 00:18:37.816 "num_base_bdevs": 2, 00:18:37.816 "num_base_bdevs_discovered": 2, 00:18:37.816 "num_base_bdevs_operational": 2, 00:18:37.816 "process": { 00:18:37.816 "type": "rebuild", 00:18:37.816 "target": "spare", 00:18:37.816 "progress": { 00:18:37.816 "blocks": 30720, 00:18:37.816 "percent": 48 00:18:37.816 } 00:18:37.816 }, 00:18:37.816 "base_bdevs_list": [ 00:18:37.816 { 00:18:37.816 "name": "spare", 00:18:37.816 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:37.816 "is_configured": true, 00:18:37.816 "data_offset": 2048, 00:18:37.816 "data_size": 63488 00:18:37.816 }, 00:18:37.816 { 00:18:37.816 "name": "BaseBdev2", 00:18:37.816 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:37.816 "is_configured": true, 00:18:37.816 "data_offset": 2048, 00:18:37.816 "data_size": 63488 00:18:37.816 } 00:18:37.816 ] 00:18:37.816 }' 00:18:37.816 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:38.073 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:38.073 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:38.073 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:18:38.073 04:19:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@711 -- # sleep 1 00:18:39.029 04:19:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:18:39.029 04:19:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:39.029 04:19:26 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:39.029 04:19:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:39.029 04:19:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:39.029 04:19:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:39.029 04:19:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.029 04:19:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:39.286 04:19:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:39.286 "name": "raid_bdev1", 00:18:39.286 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:39.286 "strip_size_kb": 0, 00:18:39.286 "state": "online", 00:18:39.286 "raid_level": "raid1", 00:18:39.286 "superblock": true, 00:18:39.286 "num_base_bdevs": 2, 00:18:39.286 "num_base_bdevs_discovered": 2, 00:18:39.286 "num_base_bdevs_operational": 2, 00:18:39.286 "process": { 00:18:39.286 "type": "rebuild", 00:18:39.286 "target": "spare", 00:18:39.286 "progress": { 00:18:39.286 "blocks": 57344, 00:18:39.286 "percent": 90 00:18:39.286 } 00:18:39.286 }, 00:18:39.286 "base_bdevs_list": [ 00:18:39.286 { 00:18:39.286 "name": "spare", 00:18:39.286 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:39.286 "is_configured": true, 00:18:39.286 "data_offset": 2048, 00:18:39.286 "data_size": 63488 00:18:39.286 }, 00:18:39.286 { 00:18:39.286 "name": "BaseBdev2", 00:18:39.286 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:39.286 "is_configured": true, 00:18:39.286 "data_offset": 2048, 00:18:39.286 "data_size": 63488 00:18:39.286 } 00:18:39.286 ] 00:18:39.286 }' 00:18:39.286 04:19:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:39.286 04:19:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:39.286 04:19:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:39.286 04:19:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:18:39.286 04:19:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@711 -- # sleep 1 00:18:39.543 [2024-05-15 04:19:27.310465] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:18:39.543 [2024-05-15 04:19:27.310531] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:18:39.543 [2024-05-15 04:19:27.310646] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:40.474 04:19:28 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:40.474 "name": "raid_bdev1", 00:18:40.474 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:40.474 "strip_size_kb": 0, 00:18:40.474 "state": "online", 00:18:40.474 "raid_level": "raid1", 00:18:40.474 "superblock": true, 00:18:40.474 "num_base_bdevs": 2, 00:18:40.474 "num_base_bdevs_discovered": 2, 00:18:40.474 "num_base_bdevs_operational": 2, 00:18:40.474 "base_bdevs_list": [ 00:18:40.474 { 00:18:40.474 "name": "spare", 00:18:40.474 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:40.474 "is_configured": true, 00:18:40.474 "data_offset": 2048, 00:18:40.474 "data_size": 63488 00:18:40.474 }, 00:18:40.474 { 00:18:40.474 "name": "BaseBdev2", 00:18:40.474 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:40.474 "is_configured": true, 00:18:40.474 "data_offset": 2048, 00:18:40.474 "data_size": 63488 00:18:40.474 } 00:18:40.474 ] 00:18:40.474 }' 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:18:40.474 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:40.732 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:18:40.732 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@709 -- # break 00:18:40.732 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:40.732 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:40.732 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:18:40.732 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:18:40.732 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:40.732 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.732 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:40.990 "name": "raid_bdev1", 00:18:40.990 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:40.990 "strip_size_kb": 0, 00:18:40.990 "state": "online", 00:18:40.990 "raid_level": "raid1", 00:18:40.990 "superblock": true, 00:18:40.990 "num_base_bdevs": 2, 00:18:40.990 "num_base_bdevs_discovered": 2, 00:18:40.990 "num_base_bdevs_operational": 2, 00:18:40.990 "base_bdevs_list": [ 00:18:40.990 { 00:18:40.990 "name": "spare", 00:18:40.990 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:40.990 "is_configured": true, 00:18:40.990 "data_offset": 2048, 00:18:40.990 "data_size": 63488 00:18:40.990 }, 00:18:40.990 { 00:18:40.990 "name": "BaseBdev2", 00:18:40.990 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:40.990 "is_configured": true, 
00:18:40.990 "data_offset": 2048, 00:18:40.990 "data_size": 63488 00:18:40.990 } 00:18:40.990 ] 00:18:40.990 }' 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.990 04:19:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:41.248 04:19:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:41.248 "name": "raid_bdev1", 00:18:41.248 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:41.248 "strip_size_kb": 0, 00:18:41.248 "state": "online", 00:18:41.248 "raid_level": "raid1", 00:18:41.248 "superblock": true, 00:18:41.248 "num_base_bdevs": 2, 00:18:41.248 "num_base_bdevs_discovered": 2, 00:18:41.248 "num_base_bdevs_operational": 2, 00:18:41.248 "base_bdevs_list": [ 00:18:41.248 { 00:18:41.248 "name": "spare", 00:18:41.248 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:41.248 "is_configured": true, 00:18:41.248 "data_offset": 2048, 00:18:41.248 "data_size": 63488 00:18:41.248 }, 00:18:41.248 { 00:18:41.248 "name": "BaseBdev2", 00:18:41.248 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:41.248 "is_configured": true, 00:18:41.248 "data_offset": 2048, 00:18:41.248 "data_size": 63488 00:18:41.248 } 00:18:41.248 ] 00:18:41.248 }' 00:18:41.248 04:19:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:41.248 04:19:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:41.813 04:19:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:42.071 [2024-05-15 04:19:29.907838] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:42.071 [2024-05-15 04:19:29.907870] bdev_raid.c:1861:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:18:42.071 [2024-05-15 04:19:29.907955] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:42.071 [2024-05-15 04:19:29.908025] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:42.071 [2024-05-15 04:19:29.908042] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x132b930 name raid_bdev1, state offline 00:18:42.071 04:19:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.071 04:19:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@720 -- # jq length 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # '[' false = true ']' 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:42.329 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:18:42.587 /dev/nbd0 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:18:42.587 1+0 records in 00:18:42.587 1+0 records out 00:18:42.587 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224703 s, 18.2 MB/s 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:42.587 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:18:42.846 /dev/nbd1 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:18:42.846 1+0 records in 00:18:42.846 1+0 records out 00:18:42.846 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249988 s, 16.4 MB/s 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:18:42.846 04:19:30 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:42.846 04:19:30 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:18:43.104 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:18:43.104 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:18:43.104 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:18:43.104 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:43.104 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:43.104 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:18:43.104 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:43.104 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:43.104 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:18:43.104 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:18:43.362 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:18:43.362 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:18:43.362 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:18:43.362 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:18:43.362 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:18:43.362 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:18:43.362 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:18:43.362 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:18:43.362 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@743 -- # '[' true = true ']' 00:18:43.362 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:43.620 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@746 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:43.878 [2024-05-15 04:19:31.757577] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:43.878 [2024-05-15 04:19:31.757637] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:43.878 [2024-05-15 
04:19:31.757665] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13269b0 00:18:43.878 [2024-05-15 04:19:31.757678] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:43.878 [2024-05-15 04:19:31.759281] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:43.878 [2024-05-15 04:19:31.759307] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:18:43.878 [2024-05-15 04:19:31.759412] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:18:43.878 [2024-05-15 04:19:31.759447] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:43.878 [2024-05-15 04:19:31.759554] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:43.878 spare 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:43.878 04:19:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:43.878 [2024-05-15 04:19:31.859880] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x131ba10 00:18:43.878 [2024-05-15 04:19:31.859902] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:43.878 [2024-05-15 04:19:31.860111] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1326810 00:18:43.878 [2024-05-15 04:19:31.860316] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x131ba10 00:18:43.878 [2024-05-15 04:19:31.860333] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x131ba10 00:18:43.878 [2024-05-15 04:19:31.860469] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:44.136 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:44.136 "name": "raid_bdev1", 00:18:44.136 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:44.136 "strip_size_kb": 0, 00:18:44.136 "state": "online", 00:18:44.136 "raid_level": "raid1", 00:18:44.136 "superblock": true, 00:18:44.136 "num_base_bdevs": 2, 00:18:44.136 "num_base_bdevs_discovered": 2, 00:18:44.136 "num_base_bdevs_operational": 2, 00:18:44.136 "base_bdevs_list": [ 00:18:44.136 { 
00:18:44.136 "name": "spare", 00:18:44.136 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:44.136 "is_configured": true, 00:18:44.136 "data_offset": 2048, 00:18:44.136 "data_size": 63488 00:18:44.136 }, 00:18:44.136 { 00:18:44.136 "name": "BaseBdev2", 00:18:44.136 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:44.136 "is_configured": true, 00:18:44.136 "data_offset": 2048, 00:18:44.136 "data_size": 63488 00:18:44.136 } 00:18:44.136 ] 00:18:44.136 }' 00:18:44.136 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:44.136 04:19:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:44.701 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:44.701 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:44.701 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:18:44.701 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:18:44.701 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:44.702 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.702 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:44.959 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:44.959 "name": "raid_bdev1", 00:18:44.959 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:44.959 "strip_size_kb": 0, 00:18:44.959 "state": "online", 00:18:44.959 "raid_level": "raid1", 00:18:44.959 "superblock": true, 00:18:44.959 "num_base_bdevs": 2, 00:18:44.959 "num_base_bdevs_discovered": 2, 00:18:44.959 "num_base_bdevs_operational": 2, 00:18:44.959 "base_bdevs_list": [ 00:18:44.959 { 00:18:44.959 "name": "spare", 00:18:44.959 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:44.959 "is_configured": true, 00:18:44.959 "data_offset": 2048, 00:18:44.959 "data_size": 63488 00:18:44.959 }, 00:18:44.959 { 00:18:44.959 "name": "BaseBdev2", 00:18:44.959 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:44.959 "is_configured": true, 00:18:44.959 "data_offset": 2048, 00:18:44.959 "data_size": 63488 00:18:44.959 } 00:18:44.959 ] 00:18:44.959 }' 00:18:44.959 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:44.959 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:44.959 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:44.959 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:18:44.959 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.960 04:19:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # jq -r '.[].base_bdevs_list[0].name' 00:18:45.217 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # [[ spare == \s\p\a\r\e ]] 00:18:45.217 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:18:45.480 [2024-05-15 04:19:33.353898] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.480 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:45.764 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:45.764 "name": "raid_bdev1", 00:18:45.764 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:45.764 "strip_size_kb": 0, 00:18:45.764 "state": "online", 00:18:45.764 "raid_level": "raid1", 00:18:45.764 "superblock": true, 00:18:45.764 "num_base_bdevs": 2, 00:18:45.764 "num_base_bdevs_discovered": 1, 00:18:45.764 "num_base_bdevs_operational": 1, 00:18:45.764 "base_bdevs_list": [ 00:18:45.764 { 00:18:45.764 "name": null, 00:18:45.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.764 "is_configured": false, 00:18:45.764 "data_offset": 2048, 00:18:45.764 "data_size": 63488 00:18:45.764 }, 00:18:45.764 { 00:18:45.764 "name": "BaseBdev2", 00:18:45.764 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:45.764 "is_configured": true, 00:18:45.764 "data_offset": 2048, 00:18:45.764 "data_size": 63488 00:18:45.764 } 00:18:45.764 ] 00:18:45.764 }' 00:18:45.764 04:19:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:45.764 04:19:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:46.330 04:19:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:18:46.589 [2024-05-15 04:19:34.384656] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:46.589 [2024-05-15 04:19:34.384872] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:18:46.589 [2024-05-15 04:19:34.384894] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
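Because raid_bdev1 was created with -s, the spare that was just removed still carries raid superblock metadata; when it is handed back via bdev_raid_add_base_bdev, the examine path compares superblock sequence numbers, sees that spare is stale (4 versus 5 on the live array, per the DEBUG line above), and re-adds it as a degraded member, after which the following lines record another rebuild starting against it. A hypothetical way to wait for that rebuild from outside the harness is to poll the same process object that the test's verify_raid_bdev_process helper reads; the loop below is a sketch under that assumption, not part of the test script.

RPC="scripts/rpc.py -s /var/tmp/spdk-raid.sock"
until [ "$($RPC bdev_raid_get_bdevs all |
           jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"')" = none ]; do
  sleep 1                                           # rebuild still running; progress.percent is also available
done
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_operational'   # expect 2 once rebuilt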
00:18:46.589 [2024-05-15 04:19:34.384927] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:46.589 [2024-05-15 04:19:34.391426] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x132cab0 00:18:46.589 [2024-05-15 04:19:34.393599] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:46.589 04:19:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # sleep 1 00:18:47.522 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@757 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:47.522 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:47.522 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:47.522 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:47.522 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:47.522 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.522 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:47.779 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:47.779 "name": "raid_bdev1", 00:18:47.779 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:47.779 "strip_size_kb": 0, 00:18:47.779 "state": "online", 00:18:47.779 "raid_level": "raid1", 00:18:47.779 "superblock": true, 00:18:47.779 "num_base_bdevs": 2, 00:18:47.779 "num_base_bdevs_discovered": 2, 00:18:47.779 "num_base_bdevs_operational": 2, 00:18:47.779 "process": { 00:18:47.779 "type": "rebuild", 00:18:47.779 "target": "spare", 00:18:47.779 "progress": { 00:18:47.779 "blocks": 24576, 00:18:47.779 "percent": 38 00:18:47.779 } 00:18:47.779 }, 00:18:47.779 "base_bdevs_list": [ 00:18:47.779 { 00:18:47.779 "name": "spare", 00:18:47.779 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:47.779 "is_configured": true, 00:18:47.779 "data_offset": 2048, 00:18:47.779 "data_size": 63488 00:18:47.779 }, 00:18:47.779 { 00:18:47.779 "name": "BaseBdev2", 00:18:47.779 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:47.779 "is_configured": true, 00:18:47.779 "data_offset": 2048, 00:18:47.779 "data_size": 63488 00:18:47.779 } 00:18:47.779 ] 00:18:47.779 }' 00:18:47.779 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:47.779 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:47.779 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:47.779 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:18:47.779 04:19:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:48.036 [2024-05-15 04:19:36.004325] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:48.037 [2024-05-15 04:19:36.006851] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:48.037 [2024-05-15 04:19:36.006906] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.037 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:48.294 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:48.294 "name": "raid_bdev1", 00:18:48.294 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:48.294 "strip_size_kb": 0, 00:18:48.294 "state": "online", 00:18:48.294 "raid_level": "raid1", 00:18:48.294 "superblock": true, 00:18:48.294 "num_base_bdevs": 2, 00:18:48.294 "num_base_bdevs_discovered": 1, 00:18:48.294 "num_base_bdevs_operational": 1, 00:18:48.294 "base_bdevs_list": [ 00:18:48.294 { 00:18:48.294 "name": null, 00:18:48.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:48.294 "is_configured": false, 00:18:48.294 "data_offset": 2048, 00:18:48.294 "data_size": 63488 00:18:48.294 }, 00:18:48.294 { 00:18:48.294 "name": "BaseBdev2", 00:18:48.294 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:48.294 "is_configured": true, 00:18:48.294 "data_offset": 2048, 00:18:48.294 "data_size": 63488 00:18:48.294 } 00:18:48.294 ] 00:18:48.294 }' 00:18:48.294 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:48.294 04:19:36 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:48.861 04:19:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:18:49.119 [2024-05-15 04:19:37.059251] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:18:49.119 [2024-05-15 04:19:37.059333] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:49.119 [2024-05-15 04:19:37.059374] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x132a350 00:18:49.119 [2024-05-15 04:19:37.059392] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:49.119 [2024-05-15 04:19:37.059870] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:49.119 [2024-05-15 04:19:37.059899] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 
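The spare-removal and re-add cycle in this test comes down to deleting and re-creating the passthru on top of spare_delay; a minimal sketch under the same assumptions as the earlier snippet (superblock mode, so examine re-adds the bdev on its own):

  sock=/var/tmp/spdk-raid.sock
  rpc() { ./scripts/rpc.py -s "$sock" "$@"; }
  # Deleting the passthru drops "spare" from raid_bdev1; the array stays online on one leg
  rpc bdev_passthru_delete spare
  rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_discovered'
  # Re-creating it lets examine find the on-disk superblock and re-add the bdev,
  # which is the "Re-adding bdev spare to raid bdev raid_bdev1" notice in the trace
  rpc bdev_passthru_create -b spare_delay -p spare
  # Rebuild activity is then visible via the same listing
  rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"'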
00:18:49.119 [2024-05-15 04:19:37.060004] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:18:49.119 [2024-05-15 04:19:37.060025] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:18:49.119 [2024-05-15 04:19:37.060036] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:18:49.119 [2024-05-15 04:19:37.060062] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:18:49.119 [2024-05-15 04:19:37.066358] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe83b90 00:18:49.119 spare 00:18:49.119 [2024-05-15 04:19:37.067941] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:18:49.119 04:19:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # sleep 1 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:50.492 "name": "raid_bdev1", 00:18:50.492 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:50.492 "strip_size_kb": 0, 00:18:50.492 "state": "online", 00:18:50.492 "raid_level": "raid1", 00:18:50.492 "superblock": true, 00:18:50.492 "num_base_bdevs": 2, 00:18:50.492 "num_base_bdevs_discovered": 2, 00:18:50.492 "num_base_bdevs_operational": 2, 00:18:50.492 "process": { 00:18:50.492 "type": "rebuild", 00:18:50.492 "target": "spare", 00:18:50.492 "progress": { 00:18:50.492 "blocks": 24576, 00:18:50.492 "percent": 38 00:18:50.492 } 00:18:50.492 }, 00:18:50.492 "base_bdevs_list": [ 00:18:50.492 { 00:18:50.492 "name": "spare", 00:18:50.492 "uuid": "c7a7b119-f067-5373-b7f6-f01091df95d0", 00:18:50.492 "is_configured": true, 00:18:50.492 "data_offset": 2048, 00:18:50.492 "data_size": 63488 00:18:50.492 }, 00:18:50.492 { 00:18:50.492 "name": "BaseBdev2", 00:18:50.492 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:50.492 "is_configured": true, 00:18:50.492 "data_offset": 2048, 00:18:50.492 "data_size": 63488 00:18:50.492 } 00:18:50.492 ] 00:18:50.492 }' 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:18:50.492 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:18:50.751 [2024-05-15 04:19:38.678820] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:18:50.751 [2024-05-15 04:19:38.681367] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:18:50.751 [2024-05-15 04:19:38.681425] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.751 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:51.009 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:51.009 "name": "raid_bdev1", 00:18:51.009 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:51.009 "strip_size_kb": 0, 00:18:51.009 "state": "online", 00:18:51.009 "raid_level": "raid1", 00:18:51.009 "superblock": true, 00:18:51.009 "num_base_bdevs": 2, 00:18:51.009 "num_base_bdevs_discovered": 1, 00:18:51.009 "num_base_bdevs_operational": 1, 00:18:51.009 "base_bdevs_list": [ 00:18:51.009 { 00:18:51.009 "name": null, 00:18:51.009 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:51.009 "is_configured": false, 00:18:51.009 "data_offset": 2048, 00:18:51.009 "data_size": 63488 00:18:51.009 }, 00:18:51.009 { 00:18:51.009 "name": "BaseBdev2", 00:18:51.009 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:51.009 "is_configured": true, 00:18:51.009 "data_offset": 2048, 00:18:51.009 "data_size": 63488 00:18:51.009 } 00:18:51.009 ] 00:18:51.009 }' 00:18:51.009 04:19:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:51.009 04:19:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:51.575 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:51.575 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:51.575 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:18:51.575 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:18:51.575 04:19:39 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:51.575 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.575 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:51.833 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:51.833 "name": "raid_bdev1", 00:18:51.833 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:51.833 "strip_size_kb": 0, 00:18:51.833 "state": "online", 00:18:51.833 "raid_level": "raid1", 00:18:51.833 "superblock": true, 00:18:51.833 "num_base_bdevs": 2, 00:18:51.833 "num_base_bdevs_discovered": 1, 00:18:51.833 "num_base_bdevs_operational": 1, 00:18:51.833 "base_bdevs_list": [ 00:18:51.833 { 00:18:51.833 "name": null, 00:18:51.833 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:51.833 "is_configured": false, 00:18:51.833 "data_offset": 2048, 00:18:51.833 "data_size": 63488 00:18:51.833 }, 00:18:51.833 { 00:18:51.833 "name": "BaseBdev2", 00:18:51.833 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:51.833 "is_configured": true, 00:18:51.833 "data_offset": 2048, 00:18:51.833 "data_size": 63488 00:18:51.833 } 00:18:51.833 ] 00:18:51.833 }' 00:18:51.833 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:51.833 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:51.833 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:52.091 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:18:52.091 04:19:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:18:52.349 04:19:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:52.349 [2024-05-15 04:19:40.339630] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:52.349 [2024-05-15 04:19:40.339703] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.349 [2024-05-15 04:19:40.339730] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x131bd90 00:18:52.349 [2024-05-15 04:19:40.339752] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.349 [2024-05-15 04:19:40.340159] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.349 [2024-05-15 04:19:40.340182] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:52.349 [2024-05-15 04:19:40.340262] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:18:52.349 [2024-05-15 04:19:40.340279] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:18:52.349 [2024-05-15 04:19:40.340288] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:18:52.349 BaseBdev1 00:18:52.349 04:19:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # sleep 
1 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:53.722 "name": "raid_bdev1", 00:18:53.722 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:53.722 "strip_size_kb": 0, 00:18:53.722 "state": "online", 00:18:53.722 "raid_level": "raid1", 00:18:53.722 "superblock": true, 00:18:53.722 "num_base_bdevs": 2, 00:18:53.722 "num_base_bdevs_discovered": 1, 00:18:53.722 "num_base_bdevs_operational": 1, 00:18:53.722 "base_bdevs_list": [ 00:18:53.722 { 00:18:53.722 "name": null, 00:18:53.722 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.722 "is_configured": false, 00:18:53.722 "data_offset": 2048, 00:18:53.722 "data_size": 63488 00:18:53.722 }, 00:18:53.722 { 00:18:53.722 "name": "BaseBdev2", 00:18:53.722 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:53.722 "is_configured": true, 00:18:53.722 "data_offset": 2048, 00:18:53.722 "data_size": 63488 00:18:53.722 } 00:18:53.722 ] 00:18:53.722 }' 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:53.722 04:19:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:54.288 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:54.288 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:54.288 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:18:54.288 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:18:54.288 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:54.288 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.288 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- 
# raid_bdev_info='{ 00:18:54.545 "name": "raid_bdev1", 00:18:54.545 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:54.545 "strip_size_kb": 0, 00:18:54.545 "state": "online", 00:18:54.545 "raid_level": "raid1", 00:18:54.545 "superblock": true, 00:18:54.545 "num_base_bdevs": 2, 00:18:54.545 "num_base_bdevs_discovered": 1, 00:18:54.545 "num_base_bdevs_operational": 1, 00:18:54.545 "base_bdevs_list": [ 00:18:54.545 { 00:18:54.545 "name": null, 00:18:54.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.545 "is_configured": false, 00:18:54.545 "data_offset": 2048, 00:18:54.545 "data_size": 63488 00:18:54.545 }, 00:18:54.545 { 00:18:54.545 "name": "BaseBdev2", 00:18:54.545 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:54.545 "is_configured": true, 00:18:54.545 "data_offset": 2048, 00:18:54.545 "data_size": 63488 00:18:54.545 } 00:18:54.545 ] 00:18:54.545 }' 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:54.545 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:54.546 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:54.546 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:54.546 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:54.546 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:54.546 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:18:54.803 [2024-05-15 04:19:42.778123] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:54.803 [2024-05-15 04:19:42.778292] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) 
smaller than existing raid bdev raid_bdev1 (5) 00:18:54.803 [2024-05-15 04:19:42.778313] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:18:54.803 request: 00:18:54.803 { 00:18:54.803 "raid_bdev": "raid_bdev1", 00:18:54.803 "base_bdev": "BaseBdev1", 00:18:54.803 "method": "bdev_raid_add_base_bdev", 00:18:54.803 "req_id": 1 00:18:54.803 } 00:18:54.803 Got JSON-RPC error response 00:18:54.803 response: 00:18:54.803 { 00:18:54.803 "code": -22, 00:18:54.803 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:18:54.803 } 00:18:54.803 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:18:54.803 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:54.803 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:54.803 04:19:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:54.803 04:19:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # sleep 1 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.175 04:19:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:56.175 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:18:56.175 "name": "raid_bdev1", 00:18:56.175 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:56.175 "strip_size_kb": 0, 00:18:56.175 "state": "online", 00:18:56.175 "raid_level": "raid1", 00:18:56.176 "superblock": true, 00:18:56.176 "num_base_bdevs": 2, 00:18:56.176 "num_base_bdevs_discovered": 1, 00:18:56.176 "num_base_bdevs_operational": 1, 00:18:56.176 "base_bdevs_list": [ 00:18:56.176 { 00:18:56.176 "name": null, 00:18:56.176 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.176 "is_configured": false, 00:18:56.176 "data_offset": 2048, 00:18:56.176 "data_size": 63488 00:18:56.176 }, 00:18:56.176 { 00:18:56.176 "name": "BaseBdev2", 00:18:56.176 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:56.176 "is_configured": true, 00:18:56.176 "data_offset": 2048, 00:18:56.176 "data_size": 63488 00:18:56.176 } 00:18:56.176 ] 00:18:56.176 }' 00:18:56.176 04:19:44 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@129 -- # xtrace_disable 00:18:56.176 04:19:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:56.742 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_process raid_bdev1 none none 00:18:56.742 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:18:56.742 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:18:56.742 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:18:56.742 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:18:56.742 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.742 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:18:57.000 "name": "raid_bdev1", 00:18:57.000 "uuid": "f4a7cb7b-3770-446d-b5ef-c727af9e8116", 00:18:57.000 "strip_size_kb": 0, 00:18:57.000 "state": "online", 00:18:57.000 "raid_level": "raid1", 00:18:57.000 "superblock": true, 00:18:57.000 "num_base_bdevs": 2, 00:18:57.000 "num_base_bdevs_discovered": 1, 00:18:57.000 "num_base_bdevs_operational": 1, 00:18:57.000 "base_bdevs_list": [ 00:18:57.000 { 00:18:57.000 "name": null, 00:18:57.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:57.000 "is_configured": false, 00:18:57.000 "data_offset": 2048, 00:18:57.000 "data_size": 63488 00:18:57.000 }, 00:18:57.000 { 00:18:57.000 "name": "BaseBdev2", 00:18:57.000 "uuid": "98b944a5-088f-58af-ab40-30d911f71913", 00:18:57.000 "is_configured": true, 00:18:57.000 "data_offset": 2048, 00:18:57.000 "data_size": 63488 00:18:57.000 } 00:18:57.000 ] 00:18:57.000 }' 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@783 -- # killprocess 3908059 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3908059 ']' 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # kill -0 3908059 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # uname 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:18:57.000 04:19:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3908059 00:18:57.000 04:19:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:18:57.000 04:19:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:18:57.000 04:19:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3908059' 00:18:57.000 killing process with pid 3908059 00:18:57.000 04:19:45 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@965 -- # kill 3908059 00:18:57.000 Received shutdown signal, test time was about 60.000000 seconds 00:18:57.000 00:18:57.000 Latency(us) 00:18:57.000 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:57.000 =================================================================================================================== 00:18:57.000 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:18:57.000 [2024-05-15 04:19:45.015369] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:57.000 04:19:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@970 -- # wait 3908059 00:18:57.000 [2024-05-15 04:19:45.015487] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:57.000 [2024-05-15 04:19:45.015550] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:57.000 [2024-05-15 04:19:45.015566] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x131ba10 name raid_bdev1, state offline 00:18:57.258 [2024-05-15 04:19:45.048016] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@785 -- # return 0 00:18:57.517 00:18:57.517 real 0m34.988s 00:18:57.517 user 0m52.312s 00:18:57.517 sys 0m5.382s 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:57.517 ************************************ 00:18:57.517 END TEST raid_rebuild_test_sb 00:18:57.517 ************************************ 00:18:57.517 04:19:45 bdev_raid -- bdev/bdev_raid.sh@813 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:18:57.517 04:19:45 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:18:57.517 04:19:45 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:18:57.517 04:19:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:57.517 ************************************ 00:18:57.517 START TEST raid_rebuild_test_io 00:18:57.517 ************************************ 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 false true true 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=2 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local superblock=false 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local background_io=true 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local verify=true 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:18:57.517 04:19:45 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local strip_size 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local create_arg 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # local data_offset 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@592 -- # '[' false = true ']' 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # raid_pid=3912635 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@598 -- # waitforlisten 3912635 /var/tmp/spdk-raid.sock 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@827 -- # '[' -z 3912635 ']' 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:57.517 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:18:57.517 04:19:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:18:57.517 [2024-05-15 04:19:45.413321] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:18:57.517 [2024-05-15 04:19:45.413391] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3912635 ] 00:18:57.517 I/O size of 3145728 is greater than zero copy threshold (65536). 00:18:57.517 Zero copy mechanism will not be used. 
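The bdevperf invocation for this test is spelled out in the trace above; a condensed sketch of the same launch from the SPDK build tree (paths shortened from the absolute ones in the log, and the readiness loop below stands in for the harness's waitforlisten helper, so it is illustrative rather than the harness's actual code):

  sock=/var/tmp/spdk-raid.sock
  # Flags copied from the trace: 60 s of 50/50 randrw, 3 MiB I/Os, queue depth 2,
  # -z to hold off until an RPC "perform_tests" trigger, bdev_raid debug logging on
  ./build/examples/bdevperf -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
      -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!
  # Wait until the RPC socket answers before configuring bdevs
  until ./scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1; do sleep 0.5; done
  # Once the raid stack exists, the run itself is kicked off with:
  #   ./examples/bdev/bdevperf/bdevperf.py -s "$sock" perform_tests &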
00:18:57.517 [2024-05-15 04:19:45.490394] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:57.775 [2024-05-15 04:19:45.599899] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:18:57.775 [2024-05-15 04:19:45.671251] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:57.775 [2024-05-15 04:19:45.671299] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:58.707 04:19:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:18:58.707 04:19:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # return 0 00:18:58.707 04:19:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:18:58.707 04:19:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:58.707 BaseBdev1_malloc 00:18:58.707 04:19:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:18:58.965 [2024-05-15 04:19:46.895212] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:18:58.965 [2024-05-15 04:19:46.895273] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.965 [2024-05-15 04:19:46.895305] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1670000 00:18:58.965 [2024-05-15 04:19:46.895321] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.965 [2024-05-15 04:19:46.897116] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.965 [2024-05-15 04:19:46.897145] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:58.965 BaseBdev1 00:18:58.965 04:19:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:18:58.965 04:19:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:59.223 BaseBdev2_malloc 00:18:59.223 04:19:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:18:59.481 [2024-05-15 04:19:47.419533] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:18:59.481 [2024-05-15 04:19:47.419599] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:59.481 [2024-05-15 04:19:47.419628] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x181b2c0 00:18:59.481 [2024-05-15 04:19:47.419644] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:59.481 [2024-05-15 04:19:47.421383] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:59.481 [2024-05-15 04:19:47.421412] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:59.481 BaseBdev2 00:18:59.481 04:19:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:18:59.739 spare_malloc 00:18:59.739 04:19:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:18:59.997 spare_delay 00:18:59.997 04:19:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@609 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:00.255 [2024-05-15 04:19:48.164334] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:00.255 [2024-05-15 04:19:48.164389] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:00.255 [2024-05-15 04:19:48.164413] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x181f100 00:19:00.255 [2024-05-15 04:19:48.164426] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:00.255 [2024-05-15 04:19:48.165908] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:00.255 [2024-05-15 04:19:48.165932] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:00.255 spare 00:19:00.255 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:00.513 [2024-05-15 04:19:48.417046] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:00.513 [2024-05-15 04:19:48.418357] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:00.513 [2024-05-15 04:19:48.418447] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x181f930 00:19:00.513 [2024-05-15 04:19:48.418461] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:00.513 [2024-05-15 04:19:48.418672] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x181b130 00:19:00.513 [2024-05-15 04:19:48.418871] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x181f930 00:19:00.513 [2024-05-15 04:19:48.418893] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x181f930 00:19:00.513 [2024-05-15 04:19:48.419041] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:00.513 04:19:48 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.513 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:00.771 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:00.771 "name": "raid_bdev1", 00:19:00.771 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:00.771 "strip_size_kb": 0, 00:19:00.771 "state": "online", 00:19:00.771 "raid_level": "raid1", 00:19:00.771 "superblock": false, 00:19:00.771 "num_base_bdevs": 2, 00:19:00.771 "num_base_bdevs_discovered": 2, 00:19:00.771 "num_base_bdevs_operational": 2, 00:19:00.771 "base_bdevs_list": [ 00:19:00.771 { 00:19:00.771 "name": "BaseBdev1", 00:19:00.771 "uuid": "2726227a-5520-5991-9c96-9645f7073ac5", 00:19:00.771 "is_configured": true, 00:19:00.771 "data_offset": 0, 00:19:00.771 "data_size": 65536 00:19:00.771 }, 00:19:00.771 { 00:19:00.771 "name": "BaseBdev2", 00:19:00.771 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:00.771 "is_configured": true, 00:19:00.771 "data_offset": 0, 00:19:00.771 "data_size": 65536 00:19:00.771 } 00:19:00.771 ] 00:19:00.771 }' 00:19:00.771 04:19:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:00.771 04:19:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:01.337 04:19:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:01.337 04:19:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:19:01.595 [2024-05-15 04:19:49.451975] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:01.595 04:19:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # raid_bdev_size=65536 00:19:01.595 04:19:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.595 04:19:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@619 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:01.853 04:19:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@619 -- # data_offset=0 00:19:01.853 04:19:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # '[' true = true ']' 00:19:01.853 04:19:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:01.853 04:19:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:01.853 [2024-05-15 04:19:49.823699] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x181a5a0 00:19:01.853 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:01.853 Zero copy mechanism will not be used. 00:19:01.853 Running I/O for 60 seconds... 
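Before the I/O run starts, the trace assembles the whole fixture over RPC; the sequence condensed into a sketch under the same assumptions as above (the spare stack is built up front but only joins the array later, when the rebuild is triggered):

  sock=/var/tmp/spdk-raid.sock
  rpc() { ./scripts/rpc.py -s "$sock" "$@"; }
  # Two 32 MiB, 512 B-block malloc bdevs wrapped in passthru (65536 blocks each, as logged)
  rpc bdev_malloc_create 32 512 -b BaseBdev1_malloc
  rpc bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
  rpc bdev_malloc_create 32 512 -b BaseBdev2_malloc
  rpc bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
  # The rebuild target: malloc -> delay -> passthru named "spare"
  rpc bdev_malloc_create 32 512 -b spare_malloc
  rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
  rpc bdev_passthru_create -b spare_delay -p spare
  # The RAID1 under test; no superblock in this variant
  rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1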
00:19:02.110 [2024-05-15 04:19:49.979987] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:02.110 [2024-05-15 04:19:49.987161] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x181a5a0 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.110 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:02.482 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:02.482 "name": "raid_bdev1", 00:19:02.482 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:02.482 "strip_size_kb": 0, 00:19:02.482 "state": "online", 00:19:02.482 "raid_level": "raid1", 00:19:02.482 "superblock": false, 00:19:02.482 "num_base_bdevs": 2, 00:19:02.482 "num_base_bdevs_discovered": 1, 00:19:02.482 "num_base_bdevs_operational": 1, 00:19:02.482 "base_bdevs_list": [ 00:19:02.482 { 00:19:02.482 "name": null, 00:19:02.482 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:02.482 "is_configured": false, 00:19:02.482 "data_offset": 0, 00:19:02.482 "data_size": 65536 00:19:02.482 }, 00:19:02.482 { 00:19:02.482 "name": "BaseBdev2", 00:19:02.482 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:02.482 "is_configured": true, 00:19:02.482 "data_offset": 0, 00:19:02.482 "data_size": 65536 00:19:02.482 } 00:19:02.482 ] 00:19:02.482 }' 00:19:02.482 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:02.482 04:19:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:03.047 04:19:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:03.305 [2024-05-15 04:19:51.080003] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:03.305 04:19:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@647 -- # sleep 1 00:19:03.305 [2024-05-15 04:19:51.122345] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1812190 00:19:03.305 [2024-05-15 04:19:51.124474] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:03.305 [2024-05-15 04:19:51.240887] 
bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:03.305 [2024-05-15 04:19:51.241463] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:03.562 [2024-05-15 04:19:51.458998] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:03.562 [2024-05-15 04:19:51.459294] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:03.820 [2024-05-15 04:19:51.819308] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:04.077 [2024-05-15 04:19:52.023263] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:04.077 [2024-05-15 04:19:52.023526] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:04.335 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:04.335 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:04.335 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:04.335 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:04.335 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:04.335 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.335 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:04.594 [2024-05-15 04:19:52.369435] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:04.594 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:04.594 "name": "raid_bdev1", 00:19:04.594 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:04.594 "strip_size_kb": 0, 00:19:04.594 "state": "online", 00:19:04.594 "raid_level": "raid1", 00:19:04.594 "superblock": false, 00:19:04.594 "num_base_bdevs": 2, 00:19:04.594 "num_base_bdevs_discovered": 2, 00:19:04.594 "num_base_bdevs_operational": 2, 00:19:04.594 "process": { 00:19:04.594 "type": "rebuild", 00:19:04.594 "target": "spare", 00:19:04.594 "progress": { 00:19:04.594 "blocks": 14336, 00:19:04.594 "percent": 21 00:19:04.594 } 00:19:04.594 }, 00:19:04.594 "base_bdevs_list": [ 00:19:04.594 { 00:19:04.594 "name": "spare", 00:19:04.594 "uuid": "aeb9360c-e1a2-5877-8c94-fed885b39866", 00:19:04.594 "is_configured": true, 00:19:04.594 "data_offset": 0, 00:19:04.594 "data_size": 65536 00:19:04.594 }, 00:19:04.594 { 00:19:04.594 "name": "BaseBdev2", 00:19:04.594 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:04.594 "is_configured": true, 00:19:04.594 "data_offset": 0, 00:19:04.594 "data_size": 65536 00:19:04.594 } 00:19:04.594 ] 00:19:04.594 }' 00:19:04.594 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:04.594 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:04.594 04:19:52 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:04.594 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:04.594 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:04.852 [2024-05-15 04:19:52.764810] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:04.852 [2024-05-15 04:19:52.792438] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:04.852 [2024-05-15 04:19:52.802003] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:04.852 [2024-05-15 04:19:52.824559] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x181a5a0 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.852 04:19:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:05.417 04:19:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:05.417 "name": "raid_bdev1", 00:19:05.417 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:05.417 "strip_size_kb": 0, 00:19:05.417 "state": "online", 00:19:05.417 "raid_level": "raid1", 00:19:05.417 "superblock": false, 00:19:05.417 "num_base_bdevs": 2, 00:19:05.417 "num_base_bdevs_discovered": 1, 00:19:05.417 "num_base_bdevs_operational": 1, 00:19:05.417 "base_bdevs_list": [ 00:19:05.417 { 00:19:05.417 "name": null, 00:19:05.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.417 "is_configured": false, 00:19:05.417 "data_offset": 0, 00:19:05.417 "data_size": 65536 00:19:05.417 }, 00:19:05.417 { 00:19:05.417 "name": "BaseBdev2", 00:19:05.417 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:05.417 "is_configured": true, 00:19:05.417 "data_offset": 0, 00:19:05.417 "data_size": 65536 00:19:05.417 } 00:19:05.417 ] 00:19:05.417 }' 00:19:05.417 04:19:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:05.417 04:19:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:05.982 04:19:53 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:05.982 04:19:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:05.982 04:19:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:05.982 04:19:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:05.982 04:19:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:05.982 04:19:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.982 04:19:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:06.239 04:19:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:06.240 "name": "raid_bdev1", 00:19:06.240 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:06.240 "strip_size_kb": 0, 00:19:06.240 "state": "online", 00:19:06.240 "raid_level": "raid1", 00:19:06.240 "superblock": false, 00:19:06.240 "num_base_bdevs": 2, 00:19:06.240 "num_base_bdevs_discovered": 1, 00:19:06.240 "num_base_bdevs_operational": 1, 00:19:06.240 "base_bdevs_list": [ 00:19:06.240 { 00:19:06.240 "name": null, 00:19:06.240 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:06.240 "is_configured": false, 00:19:06.240 "data_offset": 0, 00:19:06.240 "data_size": 65536 00:19:06.240 }, 00:19:06.240 { 00:19:06.240 "name": "BaseBdev2", 00:19:06.240 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:06.240 "is_configured": true, 00:19:06.240 "data_offset": 0, 00:19:06.240 "data_size": 65536 00:19:06.240 } 00:19:06.240 ] 00:19:06.240 }' 00:19:06.240 04:19:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:06.240 04:19:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:06.240 04:19:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:06.240 04:19:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:06.240 04:19:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:06.498 [2024-05-15 04:19:54.380191] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:06.498 04:19:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # sleep 1 00:19:06.498 [2024-05-15 04:19:54.450432] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x181a770 00:19:06.498 [2024-05-15 04:19:54.451997] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:06.755 [2024-05-15 04:19:54.553914] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:06.755 [2024-05-15 04:19:54.554347] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:06.755 [2024-05-15 04:19:54.764143] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:06.755 [2024-05-15 04:19:54.764283] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 
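The degrade-and-rebuild step traced in this test reduces to two RPCs against the running array; a minimal sketch, reusing the helper from the previous snippets (the trace performs this first with BaseBdev1 and later cycles the spare itself):

  sock=/var/tmp/spdk-raid.sock
  rpc() { ./scripts/rpc.py -s "$sock" "$@"; }
  # Drop one mirror leg while the background bdevperf I/O keeps running
  rpc bdev_raid_remove_base_bdev BaseBdev1
  # Adding the delayed "spare" passthru is what starts the rebuild
  # ("Started rebuild on raid bdev raid_bdev1" in the log)
  rpc bdev_raid_add_base_bdev raid_bdev1 spare
  # Rebuild progress shows up in the same listing used by the state checks
  rpc bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "raid_bdev1") | .process.progress.percent'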
00:19:07.015 [2024-05-15 04:19:55.022206] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:07.625 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:07.625 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:07.625 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:07.625 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:07.625 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:07.625 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.625 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:07.625 [2024-05-15 04:19:55.474633] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:07.625 [2024-05-15 04:19:55.591709] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:07.625 [2024-05-15 04:19:55.591930] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:07.884 "name": "raid_bdev1", 00:19:07.884 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:07.884 "strip_size_kb": 0, 00:19:07.884 "state": "online", 00:19:07.884 "raid_level": "raid1", 00:19:07.884 "superblock": false, 00:19:07.884 "num_base_bdevs": 2, 00:19:07.884 "num_base_bdevs_discovered": 2, 00:19:07.884 "num_base_bdevs_operational": 2, 00:19:07.884 "process": { 00:19:07.884 "type": "rebuild", 00:19:07.884 "target": "spare", 00:19:07.884 "progress": { 00:19:07.884 "blocks": 16384, 00:19:07.884 "percent": 25 00:19:07.884 } 00:19:07.884 }, 00:19:07.884 "base_bdevs_list": [ 00:19:07.884 { 00:19:07.884 "name": "spare", 00:19:07.884 "uuid": "aeb9360c-e1a2-5877-8c94-fed885b39866", 00:19:07.884 "is_configured": true, 00:19:07.884 "data_offset": 0, 00:19:07.884 "data_size": 65536 00:19:07.884 }, 00:19:07.884 { 00:19:07.884 "name": "BaseBdev2", 00:19:07.884 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:07.884 "is_configured": true, 00:19:07.884 "data_offset": 0, 00:19:07.884 "data_size": 65536 00:19:07.884 } 00:19:07.884 ] 00:19:07.884 }' 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@666 -- # '[' false = true ']' 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=2 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@693 -- # '[' 2 -gt 2 ']' 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # local timeout=671 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.884 04:19:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:08.142 04:19:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:08.142 "name": "raid_bdev1", 00:19:08.142 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:08.142 "strip_size_kb": 0, 00:19:08.142 "state": "online", 00:19:08.142 "raid_level": "raid1", 00:19:08.142 "superblock": false, 00:19:08.142 "num_base_bdevs": 2, 00:19:08.142 "num_base_bdevs_discovered": 2, 00:19:08.142 "num_base_bdevs_operational": 2, 00:19:08.142 "process": { 00:19:08.142 "type": "rebuild", 00:19:08.142 "target": "spare", 00:19:08.142 "progress": { 00:19:08.142 "blocks": 20480, 00:19:08.142 "percent": 31 00:19:08.142 } 00:19:08.142 }, 00:19:08.142 "base_bdevs_list": [ 00:19:08.142 { 00:19:08.142 "name": "spare", 00:19:08.142 "uuid": "aeb9360c-e1a2-5877-8c94-fed885b39866", 00:19:08.142 "is_configured": true, 00:19:08.142 "data_offset": 0, 00:19:08.142 "data_size": 65536 00:19:08.142 }, 00:19:08.142 { 00:19:08.142 "name": "BaseBdev2", 00:19:08.142 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:08.142 "is_configured": true, 00:19:08.142 "data_offset": 0, 00:19:08.142 "data_size": 65536 00:19:08.142 } 00:19:08.142 ] 00:19:08.142 }' 00:19:08.142 04:19:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:08.142 04:19:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:08.142 04:19:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:08.142 04:19:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:08.142 04:19:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@711 -- # sleep 1 00:19:08.400 [2024-05-15 04:19:56.241235] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:19:08.400 [2024-05-15 04:19:56.343740] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:19:08.965 [2024-05-15 04:19:56.679320] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:19:09.223 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:19:09.223 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # 
verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:09.223 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:09.223 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:09.223 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:09.223 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:09.223 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.223 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:09.223 [2024-05-15 04:19:57.146995] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:19:09.481 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:09.481 "name": "raid_bdev1", 00:19:09.481 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:09.481 "strip_size_kb": 0, 00:19:09.481 "state": "online", 00:19:09.481 "raid_level": "raid1", 00:19:09.481 "superblock": false, 00:19:09.481 "num_base_bdevs": 2, 00:19:09.481 "num_base_bdevs_discovered": 2, 00:19:09.481 "num_base_bdevs_operational": 2, 00:19:09.481 "process": { 00:19:09.481 "type": "rebuild", 00:19:09.481 "target": "spare", 00:19:09.481 "progress": { 00:19:09.481 "blocks": 40960, 00:19:09.481 "percent": 62 00:19:09.481 } 00:19:09.481 }, 00:19:09.481 "base_bdevs_list": [ 00:19:09.481 { 00:19:09.481 "name": "spare", 00:19:09.481 "uuid": "aeb9360c-e1a2-5877-8c94-fed885b39866", 00:19:09.481 "is_configured": true, 00:19:09.481 "data_offset": 0, 00:19:09.481 "data_size": 65536 00:19:09.481 }, 00:19:09.481 { 00:19:09.481 "name": "BaseBdev2", 00:19:09.481 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:09.481 "is_configured": true, 00:19:09.481 "data_offset": 0, 00:19:09.481 "data_size": 65536 00:19:09.481 } 00:19:09.481 ] 00:19:09.481 }' 00:19:09.481 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:09.481 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:09.481 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:09.481 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:09.481 04:19:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@711 -- # sleep 1 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:10.856 [2024-05-15 04:19:58.599531] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:10.856 [2024-05-15 04:19:58.705924] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:10.856 [2024-05-15 04:19:58.708691] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:10.856 "name": "raid_bdev1", 00:19:10.856 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:10.856 "strip_size_kb": 0, 00:19:10.856 "state": "online", 00:19:10.856 "raid_level": "raid1", 00:19:10.856 "superblock": false, 00:19:10.856 "num_base_bdevs": 2, 00:19:10.856 "num_base_bdevs_discovered": 2, 00:19:10.856 "num_base_bdevs_operational": 2, 00:19:10.856 "base_bdevs_list": [ 00:19:10.856 { 00:19:10.856 "name": "spare", 00:19:10.856 "uuid": "aeb9360c-e1a2-5877-8c94-fed885b39866", 00:19:10.856 "is_configured": true, 00:19:10.856 "data_offset": 0, 00:19:10.856 "data_size": 65536 00:19:10.856 }, 00:19:10.856 { 00:19:10.856 "name": "BaseBdev2", 00:19:10.856 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:10.856 "is_configured": true, 00:19:10.856 "data_offset": 0, 00:19:10.856 "data_size": 65536 00:19:10.856 } 00:19:10.856 ] 00:19:10.856 }' 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@709 -- # break 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:10.856 04:19:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.114 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:11.114 "name": "raid_bdev1", 00:19:11.114 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:11.114 "strip_size_kb": 0, 00:19:11.114 "state": "online", 00:19:11.114 "raid_level": "raid1", 00:19:11.114 "superblock": false, 00:19:11.115 "num_base_bdevs": 2, 00:19:11.115 "num_base_bdevs_discovered": 2, 00:19:11.115 "num_base_bdevs_operational": 2, 00:19:11.115 "base_bdevs_list": [ 00:19:11.115 { 00:19:11.115 "name": "spare", 00:19:11.115 "uuid": "aeb9360c-e1a2-5877-8c94-fed885b39866", 00:19:11.115 "is_configured": true, 00:19:11.115 
"data_offset": 0, 00:19:11.115 "data_size": 65536 00:19:11.115 }, 00:19:11.115 { 00:19:11.115 "name": "BaseBdev2", 00:19:11.115 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:11.115 "is_configured": true, 00:19:11.115 "data_offset": 0, 00:19:11.115 "data_size": 65536 00:19:11.115 } 00:19:11.115 ] 00:19:11.115 }' 00:19:11.115 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:11.115 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:11.115 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.373 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.631 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:11.631 "name": "raid_bdev1", 00:19:11.631 "uuid": "216d2a02-b54d-4554-8b35-f53b2e3ac428", 00:19:11.631 "strip_size_kb": 0, 00:19:11.631 "state": "online", 00:19:11.631 "raid_level": "raid1", 00:19:11.631 "superblock": false, 00:19:11.631 "num_base_bdevs": 2, 00:19:11.631 "num_base_bdevs_discovered": 2, 00:19:11.631 "num_base_bdevs_operational": 2, 00:19:11.631 "base_bdevs_list": [ 00:19:11.631 { 00:19:11.631 "name": "spare", 00:19:11.631 "uuid": "aeb9360c-e1a2-5877-8c94-fed885b39866", 00:19:11.631 "is_configured": true, 00:19:11.631 "data_offset": 0, 00:19:11.631 "data_size": 65536 00:19:11.631 }, 00:19:11.631 { 00:19:11.631 "name": "BaseBdev2", 00:19:11.631 "uuid": "601e741e-84ed-54c3-a388-02986fbb8542", 00:19:11.631 "is_configured": true, 00:19:11.631 "data_offset": 0, 00:19:11.631 "data_size": 65536 00:19:11.631 } 00:19:11.631 ] 00:19:11.631 }' 00:19:11.631 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:11.631 04:19:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:12.197 04:19:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:19:12.197 [2024-05-15 04:20:00.171925] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:12.197 [2024-05-15 04:20:00.171961] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:12.455 00:19:12.455 Latency(us) 00:19:12.455 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:12.455 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:19:12.455 raid_bdev1 : 10.38 111.22 333.67 0.00 0.00 12252.76 270.03 117285.17 00:19:12.455 =================================================================================================================== 00:19:12.455 Total : 111.22 333.67 0.00 0.00 12252.76 270.03 117285.17 00:19:12.455 [2024-05-15 04:20:00.239970] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:12.455 [2024-05-15 04:20:00.240005] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:12.455 [2024-05-15 04:20:00.240080] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:12.455 [2024-05-15 04:20:00.240098] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x181f930 name raid_bdev1, state offline 00:19:12.455 0 00:19:12.455 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.455 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@720 -- # jq length 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # '[' true = true ']' 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:12.713 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:19:12.971 /dev/nbd0 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:12.971 1+0 records in 00:19:12.971 1+0 records out 00:19:12.971 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237283 s, 17.3 MB/s 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # for bdev in "${base_bdevs[@]:1}" 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # '[' -z BaseBdev2 ']' 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:12.971 04:20:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:19:13.230 /dev/nbd1 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 
-- # local i 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:13.230 1+0 records in 00:19:13.230 1+0 records out 00:19:13.230 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000187014 s, 21.9 MB/s 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:13.230 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@45 -- # return 0 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@734 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:13.488 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@743 -- # '[' false = true ']' 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@783 -- # killprocess 3912635 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@946 -- # '[' -z 3912635 ']' 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # kill -0 3912635 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # uname 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3912635 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3912635' 00:19:13.746 killing process with pid 3912635 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@965 -- # kill 3912635 00:19:13.746 Received shutdown signal, test time was about 11.880808 seconds 00:19:13.746 00:19:13.746 Latency(us) 00:19:13.746 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:13.746 =================================================================================================================== 00:19:13.746 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:19:13.746 [2024-05-15 04:20:01.736005] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:13.746 04:20:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@970 -- # wait 
3912635 00:19:14.004 [2024-05-15 04:20:01.765600] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@785 -- # return 0 00:19:14.262 00:19:14.262 real 0m16.698s 00:19:14.262 user 0m26.350s 00:19:14.262 sys 0m2.226s 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:19:14.262 ************************************ 00:19:14.262 END TEST raid_rebuild_test_io 00:19:14.262 ************************************ 00:19:14.262 04:20:02 bdev_raid -- bdev/bdev_raid.sh@814 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:19:14.262 04:20:02 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:19:14.262 04:20:02 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:19:14.262 04:20:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:14.262 ************************************ 00:19:14.262 START TEST raid_rebuild_test_sb_io 00:19:14.262 ************************************ 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true true true 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=2 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local superblock=true 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local background_io=true 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local verify=true 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local strip_size 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local create_arg 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # local data_offset 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 
00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # '[' true = true ']' 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@593 -- # create_arg+=' -s' 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # raid_pid=3914848 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@598 -- # waitforlisten 3914848 /var/tmp/spdk-raid.sock 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@827 -- # '[' -z 3914848 ']' 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:14.262 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:19:14.262 04:20:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:14.262 [2024-05-15 04:20:02.173068] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:19:14.262 [2024-05-15 04:20:02.173153] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3914848 ] 00:19:14.262 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:14.262 Zero copy mechanism will not be used. 
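The background I/O for this variant comes from the bdevperf instance launched above: -z keeps it idle, waiting for RPCs on /var/tmp/spdk-raid.sock, so the bdev stack can be built first; the workload is 60 seconds of 50/50 random read/write at 3 MiB I/O size and queue depth 2, with bdev_raid debug logging enabled. Restated from the traced command line (backgrounding and the raid_pid capture follow the script's own convention):

  spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk
  $spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
      -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
  raid_pid=$!    # waitforlisten then blocks until the RPC socket answers
  # once raid_bdev1 exists, I/O is started through the companion script:
  $spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests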
00:19:14.262 [2024-05-15 04:20:02.255869] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:14.520 [2024-05-15 04:20:02.377239] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:19:14.520 [2024-05-15 04:20:02.446433] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:14.520 [2024-05-15 04:20:02.446463] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:15.452 04:20:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:19:15.452 04:20:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # return 0 00:19:15.452 04:20:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:19:15.452 04:20:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:15.452 BaseBdev1_malloc 00:19:15.452 04:20:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:15.711 [2024-05-15 04:20:03.581454] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:15.711 [2024-05-15 04:20:03.581518] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:15.711 [2024-05-15 04:20:03.581545] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1491000 00:19:15.711 [2024-05-15 04:20:03.581561] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:15.711 [2024-05-15 04:20:03.583171] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:15.711 [2024-05-15 04:20:03.583211] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:15.711 BaseBdev1 00:19:15.711 04:20:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:19:15.711 04:20:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:15.969 BaseBdev2_malloc 00:19:15.969 04:20:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:16.227 [2024-05-15 04:20:04.070238] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:16.227 [2024-05-15 04:20:04.070311] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:16.227 [2024-05-15 04:20:04.070340] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x163c2c0 00:19:16.227 [2024-05-15 04:20:04.070358] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:16.227 [2024-05-15 04:20:04.072076] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:16.227 [2024-05-15 04:20:04.072105] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:16.227 BaseBdev2 00:19:16.227 04:20:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b spare_malloc 00:19:16.485 spare_malloc 00:19:16.485 04:20:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:16.742 spare_delay 00:19:16.742 04:20:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@609 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:17.003 [2024-05-15 04:20:04.803178] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:17.003 [2024-05-15 04:20:04.803255] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:17.003 [2024-05-15 04:20:04.803283] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1640100 00:19:17.003 [2024-05-15 04:20:04.803299] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:17.003 [2024-05-15 04:20:04.805005] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:17.003 [2024-05-15 04:20:04.805035] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:17.003 spare 00:19:17.003 04:20:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:19:17.261 [2024-05-15 04:20:05.043849] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:17.261 [2024-05-15 04:20:05.045013] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:17.261 [2024-05-15 04:20:05.045193] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1640930 00:19:17.261 [2024-05-15 04:20:05.045212] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:17.261 [2024-05-15 04:20:05.045392] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14900f0 00:19:17.261 [2024-05-15 04:20:05.045559] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1640930 00:19:17.261 [2024-05-15 04:20:05.045576] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1640930 00:19:17.261 [2024-05-15 04:20:05.045679] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.261 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:17.519 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:17.519 "name": "raid_bdev1", 00:19:17.519 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:17.519 "strip_size_kb": 0, 00:19:17.519 "state": "online", 00:19:17.519 "raid_level": "raid1", 00:19:17.519 "superblock": true, 00:19:17.519 "num_base_bdevs": 2, 00:19:17.519 "num_base_bdevs_discovered": 2, 00:19:17.519 "num_base_bdevs_operational": 2, 00:19:17.519 "base_bdevs_list": [ 00:19:17.519 { 00:19:17.519 "name": "BaseBdev1", 00:19:17.519 "uuid": "5e5364f0-60e2-5297-a4b0-1f9ea7e9c0bd", 00:19:17.519 "is_configured": true, 00:19:17.519 "data_offset": 2048, 00:19:17.519 "data_size": 63488 00:19:17.519 }, 00:19:17.519 { 00:19:17.519 "name": "BaseBdev2", 00:19:17.519 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:17.519 "is_configured": true, 00:19:17.519 "data_offset": 2048, 00:19:17.519 "data_size": 63488 00:19:17.519 } 00:19:17.519 ] 00:19:17.519 }' 00:19:17.519 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:17.519 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:18.086 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:18.086 04:20:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:19:18.086 [2024-05-15 04:20:06.070733] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:18.086 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # raid_bdev_size=63488 00:19:18.086 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.086 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@619 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:18.344 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@619 -- # data_offset=2048 00:19:18.344 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # '[' true = true ']' 00:19:18.344 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:18.344 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:18.602 [2024-05-15 04:20:06.430159] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16426d0 00:19:18.602 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:18.602 Zero copy mechanism will not be used. 00:19:18.602 Running I/O for 60 seconds... 
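The stack now carrying that I/O was assembled through the RPCs traced above: each base bdev is a 32 MiB malloc disk wrapped in a passthru bdev, the spare additionally sits behind a delay bdev that adds write latency so the rebuild runs long enough to observe, and the raid1 array is created with -s so a superblock is written to its members. Condensed, with the names and sizes from this run (the rpc wrapper function is illustrative shorthand):

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  for b in BaseBdev1 BaseBdev2; do
      rpc bdev_malloc_create 32 512 -b ${b}_malloc
      rpc bdev_passthru_create -b ${b}_malloc -p $b
  done
  rpc bdev_malloc_create 32 512 -b spare_malloc
  rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
  rpc bdev_passthru_create -b spare_delay -p spare
  rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1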
00:19:18.602 [2024-05-15 04:20:06.551027] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:18.602 [2024-05-15 04:20:06.558380] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x16426d0 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.602 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:19.168 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:19.168 "name": "raid_bdev1", 00:19:19.168 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:19.168 "strip_size_kb": 0, 00:19:19.168 "state": "online", 00:19:19.168 "raid_level": "raid1", 00:19:19.168 "superblock": true, 00:19:19.168 "num_base_bdevs": 2, 00:19:19.168 "num_base_bdevs_discovered": 1, 00:19:19.168 "num_base_bdevs_operational": 1, 00:19:19.168 "base_bdevs_list": [ 00:19:19.168 { 00:19:19.168 "name": null, 00:19:19.168 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.168 "is_configured": false, 00:19:19.168 "data_offset": 2048, 00:19:19.168 "data_size": 63488 00:19:19.168 }, 00:19:19.168 { 00:19:19.168 "name": "BaseBdev2", 00:19:19.168 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:19.168 "is_configured": true, 00:19:19.168 "data_offset": 2048, 00:19:19.168 "data_size": 63488 00:19:19.168 } 00:19:19.168 ] 00:19:19.168 }' 00:19:19.168 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:19.168 04:20:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:19.734 04:20:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:19.991 [2024-05-15 04:20:07.759457] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:19.991 04:20:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@647 -- # sleep 1 00:19:19.992 [2024-05-15 04:20:07.822213] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15257d0 00:19:19.992 [2024-05-15 04:20:07.824344] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev 
raid_bdev1 00:19:19.992 [2024-05-15 04:20:07.948171] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:19.992 [2024-05-15 04:20:07.948717] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:20.250 [2024-05-15 04:20:08.167272] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:20.250 [2024-05-15 04:20:08.167575] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:20.508 [2024-05-15 04:20:08.426453] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:20.765 [2024-05-15 04:20:08.544508] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:21.023 04:20:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:21.023 04:20:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:21.023 04:20:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:21.023 04:20:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:21.023 04:20:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:21.023 04:20:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.023 04:20:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:21.023 [2024-05-15 04:20:08.916989] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:21.281 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:21.281 "name": "raid_bdev1", 00:19:21.281 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:21.281 "strip_size_kb": 0, 00:19:21.281 "state": "online", 00:19:21.281 "raid_level": "raid1", 00:19:21.281 "superblock": true, 00:19:21.281 "num_base_bdevs": 2, 00:19:21.281 "num_base_bdevs_discovered": 2, 00:19:21.281 "num_base_bdevs_operational": 2, 00:19:21.281 "process": { 00:19:21.281 "type": "rebuild", 00:19:21.281 "target": "spare", 00:19:21.281 "progress": { 00:19:21.281 "blocks": 14336, 00:19:21.281 "percent": 22 00:19:21.281 } 00:19:21.281 }, 00:19:21.281 "base_bdevs_list": [ 00:19:21.281 { 00:19:21.281 "name": "spare", 00:19:21.281 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:21.281 "is_configured": true, 00:19:21.281 "data_offset": 2048, 00:19:21.281 "data_size": 63488 00:19:21.281 }, 00:19:21.281 { 00:19:21.281 "name": "BaseBdev2", 00:19:21.281 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:21.281 "is_configured": true, 00:19:21.281 "data_offset": 2048, 00:19:21.281 "data_size": 63488 00:19:21.281 } 00:19:21.281 ] 00:19:21.281 }' 00:19:21.281 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:21.281 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:21.281 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r 
'.process.target // "none"' 00:19:21.281 [2024-05-15 04:20:09.168157] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:21.281 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:21.281 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:21.538 [2024-05-15 04:20:09.401703] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:21.538 [2024-05-15 04:20:09.519079] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:21.538 [2024-05-15 04:20:09.527779] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:21.539 [2024-05-15 04:20:09.549540] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x16426d0 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:21.797 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:22.056 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:22.056 "name": "raid_bdev1", 00:19:22.056 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:22.056 "strip_size_kb": 0, 00:19:22.056 "state": "online", 00:19:22.056 "raid_level": "raid1", 00:19:22.056 "superblock": true, 00:19:22.056 "num_base_bdevs": 2, 00:19:22.056 "num_base_bdevs_discovered": 1, 00:19:22.056 "num_base_bdevs_operational": 1, 00:19:22.056 "base_bdevs_list": [ 00:19:22.056 { 00:19:22.056 "name": null, 00:19:22.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.056 "is_configured": false, 00:19:22.056 "data_offset": 2048, 00:19:22.056 "data_size": 63488 00:19:22.056 }, 00:19:22.056 { 00:19:22.056 "name": "BaseBdev2", 00:19:22.056 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:22.056 "is_configured": true, 00:19:22.056 "data_offset": 2048, 00:19:22.056 "data_size": 63488 00:19:22.056 } 00:19:22.056 ] 00:19:22.056 }' 00:19:22.056 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
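At this point the sb_io case has repeated the mid-rebuild removal: the spare was pulled at bdev_raid.sh@653 while process.type was still "rebuild", the process wound down with the expected "No such device" warning, and the array is verified to fall back to a single discovered base bdev while bdevperf keeps issuing I/O. The progress figures quoted in the JSON above come from the same RPC; an illustrative polling loop over those fields (not part of bdev_raid.sh, which only samples once per check):

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  while :; do
      info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
      [[ $(echo "$info" | jq -r '.process.type // "none"') == rebuild ]] || break
      echo "rebuilding $(echo "$info" | jq -r '.process.target'): $(echo "$info" | jq -r '.process.progress.percent')%"
      sleep 1
  done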
00:19:22.056 04:20:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:22.623 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:22.623 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:22.623 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:22.623 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:22.623 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:22.623 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.623 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:22.882 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:22.882 "name": "raid_bdev1", 00:19:22.882 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:22.882 "strip_size_kb": 0, 00:19:22.882 "state": "online", 00:19:22.882 "raid_level": "raid1", 00:19:22.882 "superblock": true, 00:19:22.882 "num_base_bdevs": 2, 00:19:22.882 "num_base_bdevs_discovered": 1, 00:19:22.882 "num_base_bdevs_operational": 1, 00:19:22.882 "base_bdevs_list": [ 00:19:22.882 { 00:19:22.882 "name": null, 00:19:22.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:22.882 "is_configured": false, 00:19:22.882 "data_offset": 2048, 00:19:22.882 "data_size": 63488 00:19:22.882 }, 00:19:22.882 { 00:19:22.882 "name": "BaseBdev2", 00:19:22.882 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:22.882 "is_configured": true, 00:19:22.882 "data_offset": 2048, 00:19:22.882 "data_size": 63488 00:19:22.882 } 00:19:22.882 ] 00:19:22.882 }' 00:19:22.882 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:22.882 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:22.882 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:22.882 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:22.882 04:20:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:23.140 [2024-05-15 04:20:11.087897] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:23.140 04:20:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # sleep 1 00:19:23.140 [2024-05-15 04:20:11.129709] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1640e30 00:19:23.140 [2024-05-15 04:20:11.131349] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:23.398 [2024-05-15 04:20:11.246504] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:23.398 [2024-05-15 04:20:11.247113] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:19:23.656 [2024-05-15 04:20:11.455060] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:23.656 [2024-05-15 04:20:11.455306] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:19:23.914 [2024-05-15 04:20:11.802049] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:19:23.914 [2024-05-15 04:20:11.926814] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:23.914 [2024-05-15 04:20:11.926985] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:19:24.172 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:24.172 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:24.172 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:24.172 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:24.172 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:24.172 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.172 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:24.172 [2024-05-15 04:20:12.170620] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:24.172 [2024-05-15 04:20:12.171154] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:19:24.430 [2024-05-15 04:20:12.405341] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:24.430 [2024-05-15 04:20:12.405642] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:19:24.430 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:24.430 "name": "raid_bdev1", 00:19:24.430 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:24.430 "strip_size_kb": 0, 00:19:24.430 "state": "online", 00:19:24.430 "raid_level": "raid1", 00:19:24.430 "superblock": true, 00:19:24.430 "num_base_bdevs": 2, 00:19:24.430 "num_base_bdevs_discovered": 2, 00:19:24.430 "num_base_bdevs_operational": 2, 00:19:24.430 "process": { 00:19:24.430 "type": "rebuild", 00:19:24.430 "target": "spare", 00:19:24.430 "progress": { 00:19:24.430 "blocks": 14336, 00:19:24.430 "percent": 22 00:19:24.430 } 00:19:24.430 }, 00:19:24.430 "base_bdevs_list": [ 00:19:24.430 { 00:19:24.430 "name": "spare", 00:19:24.430 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:24.430 "is_configured": true, 00:19:24.430 "data_offset": 2048, 00:19:24.430 "data_size": 63488 00:19:24.430 }, 00:19:24.430 { 00:19:24.430 "name": "BaseBdev2", 00:19:24.430 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:24.430 "is_configured": true, 00:19:24.430 "data_offset": 2048, 00:19:24.430 "data_size": 63488 00:19:24.430 } 00:19:24.430 ] 00:19:24.430 }' 00:19:24.430 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // 
"none"' 00:19:24.688 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:24.688 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@666 -- # '[' true = true ']' 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@666 -- # '[' = false ']' 00:19:24.689 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 666: [: =: unary operator expected 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=2 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@693 -- # '[' 2 -gt 2 ']' 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # local timeout=688 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.689 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:24.947 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:24.947 "name": "raid_bdev1", 00:19:24.947 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:24.947 "strip_size_kb": 0, 00:19:24.947 "state": "online", 00:19:24.947 "raid_level": "raid1", 00:19:24.947 "superblock": true, 00:19:24.947 "num_base_bdevs": 2, 00:19:24.947 "num_base_bdevs_discovered": 2, 00:19:24.947 "num_base_bdevs_operational": 2, 00:19:24.947 "process": { 00:19:24.947 "type": "rebuild", 00:19:24.947 "target": "spare", 00:19:24.947 "progress": { 00:19:24.947 "blocks": 18432, 00:19:24.947 "percent": 29 00:19:24.947 } 00:19:24.947 }, 00:19:24.947 "base_bdevs_list": [ 00:19:24.947 { 00:19:24.947 "name": "spare", 00:19:24.947 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:24.947 "is_configured": true, 00:19:24.947 "data_offset": 2048, 00:19:24.947 "data_size": 63488 00:19:24.947 }, 00:19:24.947 { 00:19:24.947 "name": "BaseBdev2", 00:19:24.947 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:24.947 "is_configured": true, 00:19:24.947 "data_offset": 2048, 00:19:24.947 "data_size": 63488 00:19:24.947 } 00:19:24.947 ] 00:19:24.947 }' 00:19:24.947 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:24.947 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == 
\r\e\b\u\i\l\d ]] 00:19:24.947 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:24.947 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:24.947 04:20:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@711 -- # sleep 1 00:19:24.947 [2024-05-15 04:20:12.854446] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:19:24.948 [2024-05-15 04:20:12.854707] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:19:25.514 [2024-05-15 04:20:13.507247] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:19:26.080 04:20:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:19:26.080 04:20:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:26.080 04:20:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:26.080 04:20:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:26.080 04:20:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:26.080 04:20:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:26.080 04:20:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.080 04:20:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:26.080 [2024-05-15 04:20:13.977100] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:19:26.080 04:20:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:26.080 "name": "raid_bdev1", 00:19:26.080 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:26.080 "strip_size_kb": 0, 00:19:26.080 "state": "online", 00:19:26.080 "raid_level": "raid1", 00:19:26.080 "superblock": true, 00:19:26.080 "num_base_bdevs": 2, 00:19:26.080 "num_base_bdevs_discovered": 2, 00:19:26.080 "num_base_bdevs_operational": 2, 00:19:26.080 "process": { 00:19:26.080 "type": "rebuild", 00:19:26.080 "target": "spare", 00:19:26.080 "progress": { 00:19:26.080 "blocks": 40960, 00:19:26.080 "percent": 64 00:19:26.080 } 00:19:26.080 }, 00:19:26.080 "base_bdevs_list": [ 00:19:26.080 { 00:19:26.080 "name": "spare", 00:19:26.080 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:26.080 "is_configured": true, 00:19:26.080 "data_offset": 2048, 00:19:26.080 "data_size": 63488 00:19:26.080 }, 00:19:26.080 { 00:19:26.080 "name": "BaseBdev2", 00:19:26.080 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:26.080 "is_configured": true, 00:19:26.080 "data_offset": 2048, 00:19:26.080 "data_size": 63488 00:19:26.080 } 00:19:26.080 ] 00:19:26.080 }' 00:19:26.080 04:20:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:26.338 04:20:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:26.338 04:20:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 
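The bdev_raid.sh@707/@708/@711 lines above are the rebuild-progress polling loop: while SECONDS is below the 688 s timeout, the test re-reads raid_bdev1 once per second and keeps going as long as .process.type is "rebuild" and .process.target is "spare". A hedged sketch of that loop, with the rpc.py path, socket, jq filters, and 688 s value taken from the log; the standalone form is illustrative only:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
timeout=688   # seconds, as set at bdev_raid.sh@706 above
while (( SECONDS < timeout )); do
  info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  type=$(jq -r '.process.type // "none"' <<< "$info")
  target=$(jq -r '.process.target // "none"' <<< "$info")
  # When the process field disappears ("none"/"none"), the rebuild has finished.
  [[ $type == rebuild && $target == spare ]] || break
  sleep 1
done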
00:19:26.338 04:20:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:26.338 04:20:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@711 -- # sleep 1 00:19:26.338 [2024-05-15 04:20:14.329611] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:19:26.905 [2024-05-15 04:20:14.886761] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:19:27.163 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:19:27.163 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:27.163 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:27.163 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:27.163 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:27.163 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:27.163 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.163 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:27.427 [2024-05-15 04:20:15.232789] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:19:27.427 [2024-05-15 04:20:15.340180] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:19:27.427 [2024-05-15 04:20:15.342865] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:27.427 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:27.427 "name": "raid_bdev1", 00:19:27.427 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:27.427 "strip_size_kb": 0, 00:19:27.427 "state": "online", 00:19:27.427 "raid_level": "raid1", 00:19:27.427 "superblock": true, 00:19:27.427 "num_base_bdevs": 2, 00:19:27.427 "num_base_bdevs_discovered": 2, 00:19:27.427 "num_base_bdevs_operational": 2, 00:19:27.427 "base_bdevs_list": [ 00:19:27.427 { 00:19:27.427 "name": "spare", 00:19:27.427 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:27.427 "is_configured": true, 00:19:27.427 "data_offset": 2048, 00:19:27.427 "data_size": 63488 00:19:27.427 }, 00:19:27.427 { 00:19:27.427 "name": "BaseBdev2", 00:19:27.427 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:27.427 "is_configured": true, 00:19:27.427 "data_offset": 2048, 00:19:27.427 "data_size": 63488 00:19:27.427 } 00:19:27.427 ] 00:19:27.427 }' 00:19:27.427 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:27.739 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:19:27.739 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@709 -- # break 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:27.740 "name": "raid_bdev1", 00:19:27.740 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:27.740 "strip_size_kb": 0, 00:19:27.740 "state": "online", 00:19:27.740 "raid_level": "raid1", 00:19:27.740 "superblock": true, 00:19:27.740 "num_base_bdevs": 2, 00:19:27.740 "num_base_bdevs_discovered": 2, 00:19:27.740 "num_base_bdevs_operational": 2, 00:19:27.740 "base_bdevs_list": [ 00:19:27.740 { 00:19:27.740 "name": "spare", 00:19:27.740 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:27.740 "is_configured": true, 00:19:27.740 "data_offset": 2048, 00:19:27.740 "data_size": 63488 00:19:27.740 }, 00:19:27.740 { 00:19:27.740 "name": "BaseBdev2", 00:19:27.740 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:27.740 "is_configured": true, 00:19:27.740 "data_offset": 2048, 00:19:27.740 "data_size": 63488 00:19:27.740 } 00:19:27.740 ] 00:19:27.740 }' 00:19:27.740 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:28.021 04:20:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:28.279 04:20:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:28.279 "name": "raid_bdev1", 00:19:28.279 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:28.279 "strip_size_kb": 0, 00:19:28.279 "state": "online", 00:19:28.279 "raid_level": "raid1", 00:19:28.279 "superblock": true, 00:19:28.279 "num_base_bdevs": 2, 00:19:28.279 "num_base_bdevs_discovered": 2, 00:19:28.279 "num_base_bdevs_operational": 2, 00:19:28.279 "base_bdevs_list": [ 00:19:28.279 { 00:19:28.279 "name": "spare", 00:19:28.279 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:28.279 "is_configured": true, 00:19:28.279 "data_offset": 2048, 00:19:28.279 "data_size": 63488 00:19:28.279 }, 00:19:28.279 { 00:19:28.279 "name": "BaseBdev2", 00:19:28.279 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:28.279 "is_configured": true, 00:19:28.279 "data_offset": 2048, 00:19:28.279 "data_size": 63488 00:19:28.279 } 00:19:28.279 ] 00:19:28.279 }' 00:19:28.279 04:20:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:28.279 04:20:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:28.846 04:20:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:28.846 [2024-05-15 04:20:16.830685] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:28.846 [2024-05-15 04:20:16.830727] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:29.105 00:19:29.105 Latency(us) 00:19:29.105 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:29.105 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:19:29.105 raid_bdev1 : 10.40 113.02 339.07 0.00 0.00 11172.47 227.56 118838.61 00:19:29.105 =================================================================================================================== 00:19:29.105 Total : 113.02 339.07 0.00 0.00 11172.47 227.56 118838.61 00:19:29.105 [2024-05-15 04:20:16.866553] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:29.105 [2024-05-15 04:20:16.866587] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:29.105 [2024-05-15 04:20:16.866658] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:29.105 [2024-05-15 04:20:16.866673] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1640930 name raid_bdev1, state offline 00:19:29.105 0 00:19:29.105 04:20:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.105 04:20:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@720 -- # jq length 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # '[' true = true ']' 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock spare /dev/nbd0 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:29.363 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:19:29.621 /dev/nbd0 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:29.621 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:29.621 1+0 records in 00:19:29.621 1+0 records out 00:19:29.621 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243044 s, 16.9 MB/s 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # for bdev in "${base_bdevs[@]:1}" 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@727 -- # '[' -z BaseBdev2 ']' 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:29.622 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:19:29.880 /dev/nbd1 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:29.880 1+0 records in 00:19:29.880 1+0 records out 00:19:29.880 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000220149 s, 18.6 MB/s 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:29.880 
04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:29.880 04:20:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@734 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:30.138 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:30.396 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:30.396 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:30.396 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:30.396 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:30.396 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:30.396 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:30.396 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:19:30.396 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:19:30.396 04:20:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@743 -- # '[' true = true ']' 00:19:30.396 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:30.654 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:30.912 [2024-05-15 04:20:18.857801] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:30.912 [2024-05-15 04:20:18.857873] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:30.912 [2024-05-15 04:20:18.857901] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148fb50 00:19:30.912 [2024-05-15 04:20:18.857917] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:30.912 [2024-05-15 04:20:18.859699] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:30.912 [2024-05-15 04:20:18.859729] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:30.912 [2024-05-15 04:20:18.859845] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:30.912 [2024-05-15 04:20:18.859889] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:30.912 [2024-05-15 04:20:18.860025] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:30.912 spare 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:30.912 04:20:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:31.169 [2024-05-15 04:20:18.960362] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1525dd0 00:19:31.169 [2024-05-15 04:20:18.960380] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:31.169 [2024-05-15 04:20:18.960568] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1490940 00:19:31.169 [2024-05-15 04:20:18.960740] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid 
bdev generic 0x1525dd0 00:19:31.169 [2024-05-15 04:20:18.960758] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1525dd0 00:19:31.169 [2024-05-15 04:20:18.960891] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:31.169 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:31.169 "name": "raid_bdev1", 00:19:31.169 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:31.169 "strip_size_kb": 0, 00:19:31.169 "state": "online", 00:19:31.169 "raid_level": "raid1", 00:19:31.169 "superblock": true, 00:19:31.169 "num_base_bdevs": 2, 00:19:31.169 "num_base_bdevs_discovered": 2, 00:19:31.169 "num_base_bdevs_operational": 2, 00:19:31.169 "base_bdevs_list": [ 00:19:31.169 { 00:19:31.169 "name": "spare", 00:19:31.169 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:31.169 "is_configured": true, 00:19:31.169 "data_offset": 2048, 00:19:31.169 "data_size": 63488 00:19:31.169 }, 00:19:31.169 { 00:19:31.169 "name": "BaseBdev2", 00:19:31.169 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:31.169 "is_configured": true, 00:19:31.169 "data_offset": 2048, 00:19:31.169 "data_size": 63488 00:19:31.169 } 00:19:31.169 ] 00:19:31.169 }' 00:19:31.169 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:31.169 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:31.734 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:31.734 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:31.734 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:31.734 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:31.734 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:31.734 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.734 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:31.992 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:31.992 "name": "raid_bdev1", 00:19:31.992 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:31.992 "strip_size_kb": 0, 00:19:31.992 "state": "online", 00:19:31.992 "raid_level": "raid1", 00:19:31.992 "superblock": true, 00:19:31.992 "num_base_bdevs": 2, 00:19:31.992 "num_base_bdevs_discovered": 2, 00:19:31.992 "num_base_bdevs_operational": 2, 00:19:31.992 "base_bdevs_list": [ 00:19:31.992 { 00:19:31.992 "name": "spare", 00:19:31.992 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:31.992 "is_configured": true, 00:19:31.992 "data_offset": 2048, 00:19:31.992 "data_size": 63488 00:19:31.992 }, 00:19:31.992 { 00:19:31.992 "name": "BaseBdev2", 00:19:31.992 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:31.992 "is_configured": true, 00:19:31.992 "data_offset": 2048, 00:19:31.992 "data_size": 63488 00:19:31.992 } 00:19:31.992 ] 00:19:31.992 }' 00:19:31.992 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:31.992 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none 
== \n\o\n\e ]] 00:19:31.992 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:31.992 04:20:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:31.992 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.992 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # jq -r '.[].base_bdevs_list[0].name' 00:19:32.555 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # [[ spare == \s\p\a\r\e ]] 00:19:32.555 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:32.812 [2024-05-15 04:20:20.590720] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.812 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:33.070 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:33.070 "name": "raid_bdev1", 00:19:33.070 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:33.070 "strip_size_kb": 0, 00:19:33.070 "state": "online", 00:19:33.070 "raid_level": "raid1", 00:19:33.070 "superblock": true, 00:19:33.070 "num_base_bdevs": 2, 00:19:33.070 "num_base_bdevs_discovered": 1, 00:19:33.070 "num_base_bdevs_operational": 1, 00:19:33.070 "base_bdevs_list": [ 00:19:33.070 { 00:19:33.070 "name": null, 00:19:33.070 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.070 "is_configured": false, 00:19:33.070 "data_offset": 2048, 00:19:33.070 "data_size": 63488 00:19:33.070 }, 00:19:33.070 { 00:19:33.070 "name": "BaseBdev2", 00:19:33.070 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:33.070 "is_configured": true, 00:19:33.070 "data_offset": 2048, 00:19:33.070 "data_size": 63488 00:19:33.070 } 00:19:33.070 ] 00:19:33.070 }' 00:19:33.070 04:20:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:33.070 04:20:20 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:33.635 04:20:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:33.635 [2024-05-15 04:20:21.633666] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:33.635 [2024-05-15 04:20:21.633906] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:19:33.635 [2024-05-15 04:20:21.633930] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:19:33.635 [2024-05-15 04:20:21.633968] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:33.635 [2024-05-15 04:20:21.641389] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x163dab0 00:19:33.635 [2024-05-15 04:20:21.643629] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:33.892 04:20:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # sleep 1 00:19:34.825 04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@757 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:34.825 04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:34.825 04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:34.825 04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:34.825 04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:34.825 04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.825 04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:35.083 04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:35.083 "name": "raid_bdev1", 00:19:35.083 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:35.083 "strip_size_kb": 0, 00:19:35.083 "state": "online", 00:19:35.083 "raid_level": "raid1", 00:19:35.083 "superblock": true, 00:19:35.083 "num_base_bdevs": 2, 00:19:35.083 "num_base_bdevs_discovered": 2, 00:19:35.083 "num_base_bdevs_operational": 2, 00:19:35.083 "process": { 00:19:35.083 "type": "rebuild", 00:19:35.083 "target": "spare", 00:19:35.083 "progress": { 00:19:35.083 "blocks": 24576, 00:19:35.083 "percent": 38 00:19:35.083 } 00:19:35.083 }, 00:19:35.083 "base_bdevs_list": [ 00:19:35.083 { 00:19:35.083 "name": "spare", 00:19:35.083 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:35.083 "is_configured": true, 00:19:35.083 "data_offset": 2048, 00:19:35.083 "data_size": 63488 00:19:35.083 }, 00:19:35.083 { 00:19:35.083 "name": "BaseBdev2", 00:19:35.083 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:35.083 "is_configured": true, 00:19:35.083 "data_offset": 2048, 00:19:35.083 "data_size": 63488 00:19:35.083 } 00:19:35.083 ] 00:19:35.083 }' 00:19:35.083 04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:35.083 04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:35.083 
04:20:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:35.083 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:35.083 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:35.341 [2024-05-15 04:20:23.246352] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:35.341 [2024-05-15 04:20:23.256821] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:35.341 [2024-05-15 04:20:23.256884] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.341 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:35.613 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:35.613 "name": "raid_bdev1", 00:19:35.613 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:35.613 "strip_size_kb": 0, 00:19:35.613 "state": "online", 00:19:35.613 "raid_level": "raid1", 00:19:35.613 "superblock": true, 00:19:35.613 "num_base_bdevs": 2, 00:19:35.613 "num_base_bdevs_discovered": 1, 00:19:35.613 "num_base_bdevs_operational": 1, 00:19:35.613 "base_bdevs_list": [ 00:19:35.613 { 00:19:35.613 "name": null, 00:19:35.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.613 "is_configured": false, 00:19:35.613 "data_offset": 2048, 00:19:35.613 "data_size": 63488 00:19:35.613 }, 00:19:35.613 { 00:19:35.613 "name": "BaseBdev2", 00:19:35.613 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:35.613 "is_configured": true, 00:19:35.613 "data_offset": 2048, 00:19:35.613 "data_size": 63488 00:19:35.613 } 00:19:35.613 ] 00:19:35.613 }' 00:19:35.613 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:35.613 04:20:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:36.185 04:20:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:36.442 [2024-05-15 04:20:24.293645] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:36.442 [2024-05-15 04:20:24.293725] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:36.442 [2024-05-15 04:20:24.293754] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16417c0 00:19:36.442 [2024-05-15 04:20:24.293771] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:36.442 [2024-05-15 04:20:24.294236] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:36.442 [2024-05-15 04:20:24.294266] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:36.442 [2024-05-15 04:20:24.294374] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:19:36.442 [2024-05-15 04:20:24.294394] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:19:36.442 [2024-05-15 04:20:24.294415] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:19:36.442 [2024-05-15 04:20:24.294442] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:36.442 [2024-05-15 04:20:24.301381] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x163dab0 00:19:36.442 spare 00:19:36.442 [2024-05-15 04:20:24.302983] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:36.442 04:20:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # sleep 1 00:19:37.375 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:37.375 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:37.375 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:37.375 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:37.375 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:37.375 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.375 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:37.632 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:37.632 "name": "raid_bdev1", 00:19:37.632 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:37.632 "strip_size_kb": 0, 00:19:37.633 "state": "online", 00:19:37.633 "raid_level": "raid1", 00:19:37.633 "superblock": true, 00:19:37.633 "num_base_bdevs": 2, 00:19:37.633 "num_base_bdevs_discovered": 2, 00:19:37.633 "num_base_bdevs_operational": 2, 00:19:37.633 "process": { 00:19:37.633 "type": "rebuild", 00:19:37.633 "target": "spare", 00:19:37.633 "progress": { 00:19:37.633 "blocks": 24576, 00:19:37.633 "percent": 38 00:19:37.633 } 00:19:37.633 }, 00:19:37.633 "base_bdevs_list": [ 00:19:37.633 { 00:19:37.633 "name": "spare", 00:19:37.633 "uuid": "dfe38458-a9bc-5aa4-b8fb-261ebcdf5157", 00:19:37.633 "is_configured": true, 
00:19:37.633 "data_offset": 2048, 00:19:37.633 "data_size": 63488 00:19:37.633 }, 00:19:37.633 { 00:19:37.633 "name": "BaseBdev2", 00:19:37.633 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:37.633 "is_configured": true, 00:19:37.633 "data_offset": 2048, 00:19:37.633 "data_size": 63488 00:19:37.633 } 00:19:37.633 ] 00:19:37.633 }' 00:19:37.633 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:37.633 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:37.633 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:37.633 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:37.633 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:19:37.890 [2024-05-15 04:20:25.863774] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:38.149 [2024-05-15 04:20:25.916378] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:38.149 [2024-05-15 04:20:25.916434] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.149 04:20:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:38.407 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:38.407 "name": "raid_bdev1", 00:19:38.407 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:38.407 "strip_size_kb": 0, 00:19:38.407 "state": "online", 00:19:38.407 "raid_level": "raid1", 00:19:38.407 "superblock": true, 00:19:38.407 "num_base_bdevs": 2, 00:19:38.407 "num_base_bdevs_discovered": 1, 00:19:38.407 "num_base_bdevs_operational": 1, 00:19:38.407 "base_bdevs_list": [ 00:19:38.407 { 00:19:38.407 "name": null, 00:19:38.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:38.407 "is_configured": false, 00:19:38.407 "data_offset": 2048, 00:19:38.407 "data_size": 63488 
00:19:38.407 }, 00:19:38.407 { 00:19:38.407 "name": "BaseBdev2", 00:19:38.407 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:38.407 "is_configured": true, 00:19:38.407 "data_offset": 2048, 00:19:38.407 "data_size": 63488 00:19:38.407 } 00:19:38.407 ] 00:19:38.407 }' 00:19:38.407 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:38.407 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:38.972 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:38.972 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:38.972 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:38.972 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:38.972 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:38.972 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.972 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:38.972 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:38.972 "name": "raid_bdev1", 00:19:38.972 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:38.972 "strip_size_kb": 0, 00:19:38.972 "state": "online", 00:19:38.972 "raid_level": "raid1", 00:19:38.972 "superblock": true, 00:19:38.972 "num_base_bdevs": 2, 00:19:38.972 "num_base_bdevs_discovered": 1, 00:19:38.972 "num_base_bdevs_operational": 1, 00:19:38.972 "base_bdevs_list": [ 00:19:38.972 { 00:19:38.972 "name": null, 00:19:38.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:38.972 "is_configured": false, 00:19:38.972 "data_offset": 2048, 00:19:38.972 "data_size": 63488 00:19:38.972 }, 00:19:38.972 { 00:19:38.972 "name": "BaseBdev2", 00:19:38.972 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:38.972 "is_configured": true, 00:19:38.972 "data_offset": 2048, 00:19:38.972 "data_size": 63488 00:19:38.972 } 00:19:38.972 ] 00:19:38.972 }' 00:19:39.230 04:20:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:39.230 04:20:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:39.230 04:20:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:39.230 04:20:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:39.230 04:20:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:19:39.489 04:20:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:39.747 [2024-05-15 04:20:27.555383] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:39.747 [2024-05-15 04:20:27.555462] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:39.747 [2024-05-15 04:20:27.555494] vbdev_passthru.c: 
676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1491230 00:19:39.747 [2024-05-15 04:20:27.555519] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:39.747 [2024-05-15 04:20:27.555950] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:39.747 [2024-05-15 04:20:27.555977] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:39.747 [2024-05-15 04:20:27.556066] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:19:39.747 [2024-05-15 04:20:27.556086] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:39.747 [2024-05-15 04:20:27.556097] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:39.747 BaseBdev1 00:19:39.747 04:20:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # sleep 1 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.681 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:40.939 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:40.939 "name": "raid_bdev1", 00:19:40.939 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:40.939 "strip_size_kb": 0, 00:19:40.939 "state": "online", 00:19:40.939 "raid_level": "raid1", 00:19:40.939 "superblock": true, 00:19:40.939 "num_base_bdevs": 2, 00:19:40.939 "num_base_bdevs_discovered": 1, 00:19:40.939 "num_base_bdevs_operational": 1, 00:19:40.939 "base_bdevs_list": [ 00:19:40.939 { 00:19:40.939 "name": null, 00:19:40.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:40.939 "is_configured": false, 00:19:40.939 "data_offset": 2048, 00:19:40.939 "data_size": 63488 00:19:40.939 }, 00:19:40.939 { 00:19:40.939 "name": "BaseBdev2", 00:19:40.939 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:40.939 "is_configured": true, 00:19:40.939 "data_offset": 2048, 00:19:40.939 "data_size": 63488 00:19:40.939 } 00:19:40.939 ] 00:19:40.939 }' 00:19:40.939 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
00:19:40.939 04:20:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:41.506 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:41.506 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:41.506 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:41.506 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:41.506 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:41.507 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.507 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:41.765 "name": "raid_bdev1", 00:19:41.765 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:41.765 "strip_size_kb": 0, 00:19:41.765 "state": "online", 00:19:41.765 "raid_level": "raid1", 00:19:41.765 "superblock": true, 00:19:41.765 "num_base_bdevs": 2, 00:19:41.765 "num_base_bdevs_discovered": 1, 00:19:41.765 "num_base_bdevs_operational": 1, 00:19:41.765 "base_bdevs_list": [ 00:19:41.765 { 00:19:41.765 "name": null, 00:19:41.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.765 "is_configured": false, 00:19:41.765 "data_offset": 2048, 00:19:41.765 "data_size": 63488 00:19:41.765 }, 00:19:41.765 { 00:19:41.765 "name": "BaseBdev2", 00:19:41.765 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:41.765 "is_configured": true, 00:19:41.765 "data_offset": 2048, 00:19:41.765 "data_size": 63488 00:19:41.765 } 00:19:41.765 ] 00:19:41.765 }' 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # 
case "$(type -t "$arg")" in 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:41.765 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:19:42.029 [2024-05-15 04:20:29.933924] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:42.029 [2024-05-15 04:20:29.934106] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:19:42.029 [2024-05-15 04:20:29.934137] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:19:42.029 request: 00:19:42.029 { 00:19:42.029 "raid_bdev": "raid_bdev1", 00:19:42.029 "base_bdev": "BaseBdev1", 00:19:42.029 "method": "bdev_raid_add_base_bdev", 00:19:42.029 "req_id": 1 00:19:42.029 } 00:19:42.029 Got JSON-RPC error response 00:19:42.029 response: 00:19:42.029 { 00:19:42.029 "code": -22, 00:19:42.029 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:19:42.029 } 00:19:42.029 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:19:42.029 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:42.029 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:42.029 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:42.029 04:20:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # sleep 1 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:42.966 04:20:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:43.225 04:20:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:43.225 "name": "raid_bdev1", 00:19:43.225 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:43.225 "strip_size_kb": 0, 00:19:43.225 "state": "online", 00:19:43.225 "raid_level": "raid1", 00:19:43.225 "superblock": true, 00:19:43.225 "num_base_bdevs": 2, 00:19:43.225 "num_base_bdevs_discovered": 1, 00:19:43.225 "num_base_bdevs_operational": 1, 00:19:43.225 "base_bdevs_list": [ 00:19:43.225 { 00:19:43.225 "name": null, 00:19:43.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.225 "is_configured": false, 00:19:43.225 "data_offset": 2048, 00:19:43.225 "data_size": 63488 00:19:43.225 }, 00:19:43.225 { 00:19:43.225 "name": "BaseBdev2", 00:19:43.225 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:43.225 "is_configured": true, 00:19:43.225 "data_offset": 2048, 00:19:43.225 "data_size": 63488 00:19:43.225 } 00:19:43.225 ] 00:19:43.225 }' 00:19:43.225 04:20:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:43.225 04:20:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:43.792 04:20:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_process raid_bdev1 none none 00:19:43.792 04:20:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:43.792 04:20:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:19:43.792 04:20:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:19:43.792 04:20:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:43.792 04:20:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.792 04:20:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:44.050 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:44.050 "name": "raid_bdev1", 00:19:44.050 "uuid": "0f3d3e5c-024f-4ac3-b4d3-5d8ffba363ea", 00:19:44.050 "strip_size_kb": 0, 00:19:44.050 "state": "online", 00:19:44.050 "raid_level": "raid1", 00:19:44.050 "superblock": true, 00:19:44.050 "num_base_bdevs": 2, 00:19:44.050 "num_base_bdevs_discovered": 1, 00:19:44.050 "num_base_bdevs_operational": 1, 00:19:44.050 "base_bdevs_list": [ 00:19:44.050 { 00:19:44.050 "name": null, 00:19:44.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.050 "is_configured": false, 00:19:44.050 "data_offset": 2048, 00:19:44.050 "data_size": 63488 00:19:44.050 }, 00:19:44.050 { 00:19:44.050 "name": "BaseBdev2", 00:19:44.050 "uuid": "d8bd8ff1-ee18-591d-801a-ca9419b013e9", 00:19:44.050 "is_configured": true, 00:19:44.050 "data_offset": 2048, 00:19:44.050 "data_size": 63488 00:19:44.050 } 00:19:44.050 ] 00:19:44.050 }' 00:19:44.050 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 
00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@783 -- # killprocess 3914848 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@946 -- # '[' -z 3914848 ']' 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # kill -0 3914848 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # uname 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3914848 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3914848' 00:19:44.308 killing process with pid 3914848 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@965 -- # kill 3914848 00:19:44.308 Received shutdown signal, test time was about 25.633247 seconds 00:19:44.308 00:19:44.308 Latency(us) 00:19:44.308 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:19:44.308 =================================================================================================================== 00:19:44.308 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:19:44.308 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@970 -- # wait 3914848 00:19:44.308 [2024-05-15 04:20:32.128486] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:44.308 [2024-05-15 04:20:32.128619] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:44.308 [2024-05-15 04:20:32.128679] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:44.308 [2024-05-15 04:20:32.128696] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1525dd0 name raid_bdev1, state offline 00:19:44.308 [2024-05-15 04:20:32.155143] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@785 -- # return 0 00:19:44.567 00:19:44.567 real 0m30.307s 00:19:44.567 user 0m48.633s 00:19:44.567 sys 0m3.496s 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:19:44.567 ************************************ 00:19:44.567 END TEST raid_rebuild_test_sb_io 00:19:44.567 ************************************ 00:19:44.567 04:20:32 bdev_raid -- bdev/bdev_raid.sh@810 -- # for n in 2 4 00:19:44.567 04:20:32 bdev_raid -- bdev/bdev_raid.sh@811 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:19:44.567 04:20:32 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:19:44.567 04:20:32 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:19:44.567 04:20:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:44.567 ************************************ 00:19:44.567 START TEST raid_rebuild_test 00:19:44.567 
************************************ 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 false false true 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=4 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local superblock=false 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local background_io=false 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local verify=true 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # echo BaseBdev3 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # echo BaseBdev4 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local strip_size 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local create_arg 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@579 -- # local data_offset 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@592 -- # '[' false = true ']' 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # raid_pid=3918877 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@598 -- # waitforlisten 3918877 /var/tmp/spdk-raid.sock 
00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@827 -- # '[' -z 3918877 ']' 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@832 -- # local max_retries=100 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:44.567 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # xtrace_disable 00:19:44.567 04:20:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:44.567 [2024-05-15 04:20:32.533265] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:19:44.567 [2024-05-15 04:20:32.533333] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3918877 ] 00:19:44.567 I/O size of 3145728 is greater than zero copy threshold (65536). 00:19:44.567 Zero copy mechanism will not be used. 00:19:44.825 [2024-05-15 04:20:32.608555] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:44.825 [2024-05-15 04:20:32.717784] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:19:44.825 [2024-05-15 04:20:32.795857] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:44.825 [2024-05-15 04:20:32.795900] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:45.759 04:20:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:19:45.759 04:20:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # return 0 00:19:45.759 04:20:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:19:45.759 04:20:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:45.759 BaseBdev1_malloc 00:19:46.017 04:20:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:19:46.275 [2024-05-15 04:20:34.062462] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:19:46.275 [2024-05-15 04:20:34.062522] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:46.275 [2024-05-15 04:20:34.062551] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b6000 00:19:46.275 [2024-05-15 04:20:34.062564] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:46.275 [2024-05-15 04:20:34.064160] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:46.275 [2024-05-15 04:20:34.064184] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:46.275 BaseBdev1 00:19:46.275 04:20:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:19:46.275 04:20:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:46.534 BaseBdev2_malloc 00:19:46.534 04:20:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:19:46.792 [2024-05-15 04:20:34.647401] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:19:46.792 [2024-05-15 04:20:34.647463] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:46.792 [2024-05-15 04:20:34.647493] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27612c0 00:19:46.792 [2024-05-15 04:20:34.647507] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:46.792 [2024-05-15 04:20:34.649220] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:46.792 [2024-05-15 04:20:34.649249] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:46.792 BaseBdev2 00:19:46.792 04:20:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:19:46.792 04:20:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:47.050 BaseBdev3_malloc 00:19:47.050 04:20:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:19:47.307 [2024-05-15 04:20:35.144170] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:19:47.307 [2024-05-15 04:20:35.144225] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:47.307 [2024-05-15 04:20:35.144249] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2762e30 00:19:47.307 [2024-05-15 04:20:35.144261] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:47.307 [2024-05-15 04:20:35.145737] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:47.307 [2024-05-15 04:20:35.145761] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:47.307 BaseBdev3 00:19:47.307 04:20:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:19:47.307 04:20:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:47.565 BaseBdev4_malloc 00:19:47.565 04:20:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:19:47.823 [2024-05-15 04:20:35.680110] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:19:47.823 [2024-05-15 04:20:35.680181] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:47.823 [2024-05-15 04:20:35.680220] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27661a0 00:19:47.823 [2024-05-15 04:20:35.680237] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:19:47.823 [2024-05-15 04:20:35.681933] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:47.823 [2024-05-15 04:20:35.681962] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:47.823 BaseBdev4 00:19:47.823 04:20:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:19:48.107 spare_malloc 00:19:48.107 04:20:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:19:48.394 spare_delay 00:19:48.394 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@609 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:19:48.651 [2024-05-15 04:20:36.456693] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:19:48.652 [2024-05-15 04:20:36.456749] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:48.652 [2024-05-15 04:20:36.456779] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27664b0 00:19:48.652 [2024-05-15 04:20:36.456793] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:48.652 [2024-05-15 04:20:36.458426] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:48.652 [2024-05-15 04:20:36.458450] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:19:48.652 spare 00:19:48.652 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:19:48.910 [2024-05-15 04:20:36.693361] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:48.910 [2024-05-15 04:20:36.694576] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:48.910 [2024-05-15 04:20:36.694632] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:48.911 [2024-05-15 04:20:36.694680] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:48.911 [2024-05-15 04:20:36.694770] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x275de90 00:19:48.911 [2024-05-15 04:20:36.694784] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:48.911 [2024-05-15 04:20:36.695029] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b0410 00:19:48.911 [2024-05-15 04:20:36.695232] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x275de90 00:19:48.911 [2024-05-15 04:20:36.695246] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x275de90 00:19:48.911 [2024-05-15 04:20:36.695386] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.911 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:49.169 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:49.169 "name": "raid_bdev1", 00:19:49.169 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:19:49.169 "strip_size_kb": 0, 00:19:49.169 "state": "online", 00:19:49.169 "raid_level": "raid1", 00:19:49.169 "superblock": false, 00:19:49.169 "num_base_bdevs": 4, 00:19:49.169 "num_base_bdevs_discovered": 4, 00:19:49.169 "num_base_bdevs_operational": 4, 00:19:49.169 "base_bdevs_list": [ 00:19:49.169 { 00:19:49.169 "name": "BaseBdev1", 00:19:49.169 "uuid": "e055d43e-1a01-531e-8ec7-18fad329dda0", 00:19:49.169 "is_configured": true, 00:19:49.169 "data_offset": 0, 00:19:49.169 "data_size": 65536 00:19:49.169 }, 00:19:49.169 { 00:19:49.169 "name": "BaseBdev2", 00:19:49.169 "uuid": "afda1a39-5831-5727-9048-61945fabc721", 00:19:49.169 "is_configured": true, 00:19:49.169 "data_offset": 0, 00:19:49.169 "data_size": 65536 00:19:49.169 }, 00:19:49.169 { 00:19:49.169 "name": "BaseBdev3", 00:19:49.169 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:19:49.169 "is_configured": true, 00:19:49.169 "data_offset": 0, 00:19:49.169 "data_size": 65536 00:19:49.169 }, 00:19:49.169 { 00:19:49.169 "name": "BaseBdev4", 00:19:49.169 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:19:49.169 "is_configured": true, 00:19:49.169 "data_offset": 0, 00:19:49.169 "data_size": 65536 00:19:49.169 } 00:19:49.169 ] 00:19:49.169 }' 00:19:49.169 04:20:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:49.169 04:20:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:49.735 04:20:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:49.735 04:20:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:19:49.735 [2024-05-15 04:20:37.736380] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:49.993 04:20:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # raid_bdev_size=65536 00:19:49.993 04:20:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.993 04:20:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@619 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:19:49.993 
04:20:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@619 -- # data_offset=0 00:19:49.993 04:20:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@621 -- # '[' false = true ']' 00:19:49.993 04:20:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # '[' true = true ']' 00:19:49.993 04:20:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@625 -- # local write_unit_size 00:19:49.993 04:20:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:19:49.993 04:20:37 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:49.993 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:19:49.993 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:19:49.993 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:19:49.993 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:19:49.993 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:19:49.993 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:19:49.993 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:49.993 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:19:50.559 [2024-05-15 04:20:38.269609] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25acab0 00:19:50.559 /dev/nbd0 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:19:50.559 1+0 records in 00:19:50.559 1+0 records out 00:19:50.559 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000146792 s, 27.9 MB/s 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 
00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@629 -- # '[' raid1 = raid5f ']' 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@633 -- # write_unit_size=1 00:19:50.559 04:20:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:19:57.115 65536+0 records in 00:19:57.115 65536+0 records out 00:19:57.115 33554432 bytes (34 MB, 32 MiB) copied, 6.2735 s, 5.3 MB/s 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@636 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:19:57.116 [2024-05-15 04:20:44.863233] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:19:57.116 04:20:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:19:57.116 [2024-05-15 04:20:45.127985] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local 
raid_bdev_info 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.374 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:57.632 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:19:57.632 "name": "raid_bdev1", 00:19:57.632 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:19:57.632 "strip_size_kb": 0, 00:19:57.632 "state": "online", 00:19:57.632 "raid_level": "raid1", 00:19:57.632 "superblock": false, 00:19:57.632 "num_base_bdevs": 4, 00:19:57.632 "num_base_bdevs_discovered": 3, 00:19:57.632 "num_base_bdevs_operational": 3, 00:19:57.632 "base_bdevs_list": [ 00:19:57.632 { 00:19:57.632 "name": null, 00:19:57.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:57.632 "is_configured": false, 00:19:57.632 "data_offset": 0, 00:19:57.632 "data_size": 65536 00:19:57.632 }, 00:19:57.632 { 00:19:57.632 "name": "BaseBdev2", 00:19:57.632 "uuid": "afda1a39-5831-5727-9048-61945fabc721", 00:19:57.632 "is_configured": true, 00:19:57.632 "data_offset": 0, 00:19:57.632 "data_size": 65536 00:19:57.632 }, 00:19:57.632 { 00:19:57.632 "name": "BaseBdev3", 00:19:57.632 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:19:57.632 "is_configured": true, 00:19:57.632 "data_offset": 0, 00:19:57.632 "data_size": 65536 00:19:57.632 }, 00:19:57.632 { 00:19:57.632 "name": "BaseBdev4", 00:19:57.632 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:19:57.632 "is_configured": true, 00:19:57.632 "data_offset": 0, 00:19:57.632 "data_size": 65536 00:19:57.632 } 00:19:57.632 ] 00:19:57.632 }' 00:19:57.632 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:19:57.632 04:20:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:19:58.199 04:20:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:19:58.199 [2024-05-15 04:20:46.194938] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:19:58.199 [2024-05-15 04:20:46.199671] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25acb40 00:19:58.199 [2024-05-15 04:20:46.201670] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:19:58.457 04:20:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@647 -- # sleep 1 00:19:59.392 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:19:59.392 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:19:59.392 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:19:59.392 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:19:59.392 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:19:59.392 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.392 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:59.650 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:19:59.650 "name": "raid_bdev1", 00:19:59.650 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:19:59.650 "strip_size_kb": 0, 00:19:59.650 "state": "online", 00:19:59.650 "raid_level": "raid1", 00:19:59.650 "superblock": false, 00:19:59.650 "num_base_bdevs": 4, 00:19:59.650 "num_base_bdevs_discovered": 4, 00:19:59.650 "num_base_bdevs_operational": 4, 00:19:59.650 "process": { 00:19:59.650 "type": "rebuild", 00:19:59.650 "target": "spare", 00:19:59.650 "progress": { 00:19:59.650 "blocks": 24576, 00:19:59.650 "percent": 37 00:19:59.650 } 00:19:59.650 }, 00:19:59.650 "base_bdevs_list": [ 00:19:59.650 { 00:19:59.650 "name": "spare", 00:19:59.650 "uuid": "e3016e9d-aebc-514f-9382-305b19e64919", 00:19:59.650 "is_configured": true, 00:19:59.650 "data_offset": 0, 00:19:59.650 "data_size": 65536 00:19:59.650 }, 00:19:59.650 { 00:19:59.650 "name": "BaseBdev2", 00:19:59.650 "uuid": "afda1a39-5831-5727-9048-61945fabc721", 00:19:59.650 "is_configured": true, 00:19:59.650 "data_offset": 0, 00:19:59.650 "data_size": 65536 00:19:59.650 }, 00:19:59.650 { 00:19:59.650 "name": "BaseBdev3", 00:19:59.651 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:19:59.651 "is_configured": true, 00:19:59.651 "data_offset": 0, 00:19:59.651 "data_size": 65536 00:19:59.651 }, 00:19:59.651 { 00:19:59.651 "name": "BaseBdev4", 00:19:59.651 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:19:59.651 "is_configured": true, 00:19:59.651 "data_offset": 0, 00:19:59.651 "data_size": 65536 00:19:59.651 } 00:19:59.651 ] 00:19:59.651 }' 00:19:59.651 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:19:59.651 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:19:59.651 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:19:59.651 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:19:59.651 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:19:59.910 [2024-05-15 04:20:47.787963] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:19:59.910 [2024-05-15 04:20:47.815013] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:19:59.910 [2024-05-15 04:20:47.815063] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 
00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.910 04:20:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:00.168 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:00.168 "name": "raid_bdev1", 00:20:00.168 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:20:00.168 "strip_size_kb": 0, 00:20:00.168 "state": "online", 00:20:00.168 "raid_level": "raid1", 00:20:00.168 "superblock": false, 00:20:00.168 "num_base_bdevs": 4, 00:20:00.168 "num_base_bdevs_discovered": 3, 00:20:00.168 "num_base_bdevs_operational": 3, 00:20:00.168 "base_bdevs_list": [ 00:20:00.168 { 00:20:00.168 "name": null, 00:20:00.168 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:00.168 "is_configured": false, 00:20:00.168 "data_offset": 0, 00:20:00.168 "data_size": 65536 00:20:00.168 }, 00:20:00.168 { 00:20:00.168 "name": "BaseBdev2", 00:20:00.168 "uuid": "afda1a39-5831-5727-9048-61945fabc721", 00:20:00.168 "is_configured": true, 00:20:00.168 "data_offset": 0, 00:20:00.168 "data_size": 65536 00:20:00.168 }, 00:20:00.169 { 00:20:00.169 "name": "BaseBdev3", 00:20:00.169 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:20:00.169 "is_configured": true, 00:20:00.169 "data_offset": 0, 00:20:00.169 "data_size": 65536 00:20:00.169 }, 00:20:00.169 { 00:20:00.169 "name": "BaseBdev4", 00:20:00.169 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:20:00.169 "is_configured": true, 00:20:00.169 "data_offset": 0, 00:20:00.169 "data_size": 65536 00:20:00.169 } 00:20:00.169 ] 00:20:00.169 }' 00:20:00.169 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:00.169 04:20:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.735 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:00.735 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:00.735 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:00.735 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:00.735 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:00.735 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.735 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:00.993 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:00.993 "name": "raid_bdev1", 00:20:00.993 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:20:00.993 "strip_size_kb": 0, 00:20:00.993 "state": "online", 00:20:00.993 "raid_level": "raid1", 00:20:00.993 "superblock": false, 00:20:00.993 "num_base_bdevs": 4, 
00:20:00.993 "num_base_bdevs_discovered": 3, 00:20:00.993 "num_base_bdevs_operational": 3, 00:20:00.993 "base_bdevs_list": [ 00:20:00.993 { 00:20:00.993 "name": null, 00:20:00.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:00.993 "is_configured": false, 00:20:00.993 "data_offset": 0, 00:20:00.993 "data_size": 65536 00:20:00.993 }, 00:20:00.993 { 00:20:00.993 "name": "BaseBdev2", 00:20:00.993 "uuid": "afda1a39-5831-5727-9048-61945fabc721", 00:20:00.993 "is_configured": true, 00:20:00.993 "data_offset": 0, 00:20:00.993 "data_size": 65536 00:20:00.993 }, 00:20:00.993 { 00:20:00.993 "name": "BaseBdev3", 00:20:00.993 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:20:00.993 "is_configured": true, 00:20:00.993 "data_offset": 0, 00:20:00.993 "data_size": 65536 00:20:00.993 }, 00:20:00.993 { 00:20:00.993 "name": "BaseBdev4", 00:20:00.993 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:20:00.993 "is_configured": true, 00:20:00.993 "data_offset": 0, 00:20:00.993 "data_size": 65536 00:20:00.993 } 00:20:00.993 ] 00:20:00.993 }' 00:20:00.993 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:00.993 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:00.993 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:00.994 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:00.994 04:20:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:01.252 [2024-05-15 04:20:49.207898] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:01.252 [2024-05-15 04:20:49.213435] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25ae690 00:20:01.252 [2024-05-15 04:20:49.215024] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:01.252 04:20:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # sleep 1 00:20:02.627 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:02.627 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:02.627 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:02.627 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:02.627 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:02.627 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.627 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.627 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:02.627 "name": "raid_bdev1", 00:20:02.627 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:20:02.627 "strip_size_kb": 0, 00:20:02.627 "state": "online", 00:20:02.627 "raid_level": "raid1", 00:20:02.627 "superblock": false, 00:20:02.627 "num_base_bdevs": 4, 00:20:02.627 "num_base_bdevs_discovered": 4, 00:20:02.627 "num_base_bdevs_operational": 4, 00:20:02.627 "process": { 00:20:02.627 "type": "rebuild", 00:20:02.627 
"target": "spare", 00:20:02.627 "progress": { 00:20:02.627 "blocks": 24576, 00:20:02.627 "percent": 37 00:20:02.627 } 00:20:02.627 }, 00:20:02.627 "base_bdevs_list": [ 00:20:02.627 { 00:20:02.627 "name": "spare", 00:20:02.627 "uuid": "e3016e9d-aebc-514f-9382-305b19e64919", 00:20:02.627 "is_configured": true, 00:20:02.627 "data_offset": 0, 00:20:02.627 "data_size": 65536 00:20:02.627 }, 00:20:02.627 { 00:20:02.627 "name": "BaseBdev2", 00:20:02.627 "uuid": "afda1a39-5831-5727-9048-61945fabc721", 00:20:02.627 "is_configured": true, 00:20:02.627 "data_offset": 0, 00:20:02.627 "data_size": 65536 00:20:02.627 }, 00:20:02.627 { 00:20:02.627 "name": "BaseBdev3", 00:20:02.627 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:20:02.627 "is_configured": true, 00:20:02.627 "data_offset": 0, 00:20:02.627 "data_size": 65536 00:20:02.627 }, 00:20:02.627 { 00:20:02.627 "name": "BaseBdev4", 00:20:02.627 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:20:02.628 "is_configured": true, 00:20:02.628 "data_offset": 0, 00:20:02.628 "data_size": 65536 00:20:02.628 } 00:20:02.628 ] 00:20:02.628 }' 00:20:02.628 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:02.628 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:02.628 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:02.628 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:02.628 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@666 -- # '[' false = true ']' 00:20:02.628 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=4 00:20:02.628 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:20:02.628 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@693 -- # '[' 4 -gt 2 ']' 00:20:02.628 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@695 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:02.886 [2024-05-15 04:20:50.861879] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:03.145 [2024-05-15 04:20:50.929209] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x25ae690 00:20:03.145 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # base_bdevs[1]= 00:20:03.145 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@699 -- # (( num_base_bdevs_operational-- )) 00:20:03.145 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@702 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:03.145 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:03.145 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:03.145 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:03.145 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:03.145 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.145 04:20:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # 
raid_bdev_info='{ 00:20:03.403 "name": "raid_bdev1", 00:20:03.403 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:20:03.403 "strip_size_kb": 0, 00:20:03.403 "state": "online", 00:20:03.403 "raid_level": "raid1", 00:20:03.403 "superblock": false, 00:20:03.403 "num_base_bdevs": 4, 00:20:03.403 "num_base_bdevs_discovered": 3, 00:20:03.403 "num_base_bdevs_operational": 3, 00:20:03.403 "process": { 00:20:03.403 "type": "rebuild", 00:20:03.403 "target": "spare", 00:20:03.403 "progress": { 00:20:03.403 "blocks": 38912, 00:20:03.403 "percent": 59 00:20:03.403 } 00:20:03.403 }, 00:20:03.403 "base_bdevs_list": [ 00:20:03.403 { 00:20:03.403 "name": "spare", 00:20:03.403 "uuid": "e3016e9d-aebc-514f-9382-305b19e64919", 00:20:03.403 "is_configured": true, 00:20:03.403 "data_offset": 0, 00:20:03.403 "data_size": 65536 00:20:03.403 }, 00:20:03.403 { 00:20:03.403 "name": null, 00:20:03.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.403 "is_configured": false, 00:20:03.403 "data_offset": 0, 00:20:03.403 "data_size": 65536 00:20:03.403 }, 00:20:03.403 { 00:20:03.403 "name": "BaseBdev3", 00:20:03.403 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:20:03.403 "is_configured": true, 00:20:03.403 "data_offset": 0, 00:20:03.403 "data_size": 65536 00:20:03.403 }, 00:20:03.403 { 00:20:03.403 "name": "BaseBdev4", 00:20:03.403 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:20:03.403 "is_configured": true, 00:20:03.403 "data_offset": 0, 00:20:03.403 "data_size": 65536 00:20:03.403 } 00:20:03.403 ] 00:20:03.403 }' 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # local timeout=727 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.403 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:03.662 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:03.662 "name": "raid_bdev1", 00:20:03.662 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:20:03.662 "strip_size_kb": 0, 00:20:03.662 "state": "online", 00:20:03.662 "raid_level": "raid1", 00:20:03.662 "superblock": false, 00:20:03.662 "num_base_bdevs": 4, 00:20:03.662 "num_base_bdevs_discovered": 3, 00:20:03.662 "num_base_bdevs_operational": 3, 00:20:03.662 "process": { 00:20:03.662 "type": 
"rebuild", 00:20:03.662 "target": "spare", 00:20:03.662 "progress": { 00:20:03.662 "blocks": 45056, 00:20:03.662 "percent": 68 00:20:03.662 } 00:20:03.662 }, 00:20:03.662 "base_bdevs_list": [ 00:20:03.662 { 00:20:03.662 "name": "spare", 00:20:03.662 "uuid": "e3016e9d-aebc-514f-9382-305b19e64919", 00:20:03.662 "is_configured": true, 00:20:03.662 "data_offset": 0, 00:20:03.662 "data_size": 65536 00:20:03.662 }, 00:20:03.662 { 00:20:03.662 "name": null, 00:20:03.662 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.662 "is_configured": false, 00:20:03.662 "data_offset": 0, 00:20:03.662 "data_size": 65536 00:20:03.662 }, 00:20:03.662 { 00:20:03.662 "name": "BaseBdev3", 00:20:03.662 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:20:03.662 "is_configured": true, 00:20:03.662 "data_offset": 0, 00:20:03.662 "data_size": 65536 00:20:03.662 }, 00:20:03.662 { 00:20:03.662 "name": "BaseBdev4", 00:20:03.662 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:20:03.662 "is_configured": true, 00:20:03.662 "data_offset": 0, 00:20:03.662 "data_size": 65536 00:20:03.662 } 00:20:03.662 ] 00:20:03.662 }' 00:20:03.662 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:03.662 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:03.662 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:03.662 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:03.662 04:20:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@711 -- # sleep 1 00:20:04.598 [2024-05-15 04:20:52.441394] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:04.598 [2024-05-15 04:20:52.441464] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:04.598 [2024-05-15 04:20:52.441516] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:04.857 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:20:04.857 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:04.857 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:04.857 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:04.857 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:04.857 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:04.857 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.857 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:04.857 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:04.857 "name": "raid_bdev1", 00:20:04.857 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:20:04.857 "strip_size_kb": 0, 00:20:04.857 "state": "online", 00:20:04.857 "raid_level": "raid1", 00:20:04.857 "superblock": false, 00:20:04.857 "num_base_bdevs": 4, 00:20:04.857 "num_base_bdevs_discovered": 3, 00:20:04.857 "num_base_bdevs_operational": 3, 00:20:04.857 "base_bdevs_list": [ 00:20:04.857 { 00:20:04.857 "name": "spare", 00:20:04.857 "uuid": 
"e3016e9d-aebc-514f-9382-305b19e64919", 00:20:04.857 "is_configured": true, 00:20:04.857 "data_offset": 0, 00:20:04.857 "data_size": 65536 00:20:04.857 }, 00:20:04.857 { 00:20:04.857 "name": null, 00:20:04.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.857 "is_configured": false, 00:20:04.857 "data_offset": 0, 00:20:04.857 "data_size": 65536 00:20:04.857 }, 00:20:04.857 { 00:20:04.857 "name": "BaseBdev3", 00:20:04.857 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:20:04.857 "is_configured": true, 00:20:04.857 "data_offset": 0, 00:20:04.857 "data_size": 65536 00:20:04.857 }, 00:20:04.857 { 00:20:04.857 "name": "BaseBdev4", 00:20:04.857 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:20:04.857 "is_configured": true, 00:20:04.857 "data_offset": 0, 00:20:04.857 "data_size": 65536 00:20:04.857 } 00:20:04.857 ] 00:20:04.857 }' 00:20:04.857 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@709 -- # break 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.115 04:20:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:05.373 "name": "raid_bdev1", 00:20:05.373 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:20:05.373 "strip_size_kb": 0, 00:20:05.373 "state": "online", 00:20:05.373 "raid_level": "raid1", 00:20:05.373 "superblock": false, 00:20:05.373 "num_base_bdevs": 4, 00:20:05.373 "num_base_bdevs_discovered": 3, 00:20:05.373 "num_base_bdevs_operational": 3, 00:20:05.373 "base_bdevs_list": [ 00:20:05.373 { 00:20:05.373 "name": "spare", 00:20:05.373 "uuid": "e3016e9d-aebc-514f-9382-305b19e64919", 00:20:05.373 "is_configured": true, 00:20:05.373 "data_offset": 0, 00:20:05.373 "data_size": 65536 00:20:05.373 }, 00:20:05.373 { 00:20:05.373 "name": null, 00:20:05.373 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.373 "is_configured": false, 00:20:05.373 "data_offset": 0, 00:20:05.373 "data_size": 65536 00:20:05.373 }, 00:20:05.373 { 00:20:05.373 "name": "BaseBdev3", 00:20:05.373 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:20:05.373 "is_configured": true, 00:20:05.373 "data_offset": 0, 00:20:05.373 "data_size": 65536 00:20:05.373 }, 00:20:05.373 { 00:20:05.373 "name": "BaseBdev4", 00:20:05.373 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:20:05.373 "is_configured": true, 00:20:05.373 
"data_offset": 0, 00:20:05.373 "data_size": 65536 00:20:05.373 } 00:20:05.373 ] 00:20:05.373 }' 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.373 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:05.631 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:05.631 "name": "raid_bdev1", 00:20:05.631 "uuid": "a75cb458-0301-4c02-8b6e-8d883ed7c640", 00:20:05.631 "strip_size_kb": 0, 00:20:05.631 "state": "online", 00:20:05.631 "raid_level": "raid1", 00:20:05.631 "superblock": false, 00:20:05.631 "num_base_bdevs": 4, 00:20:05.631 "num_base_bdevs_discovered": 3, 00:20:05.631 "num_base_bdevs_operational": 3, 00:20:05.631 "base_bdevs_list": [ 00:20:05.631 { 00:20:05.631 "name": "spare", 00:20:05.631 "uuid": "e3016e9d-aebc-514f-9382-305b19e64919", 00:20:05.631 "is_configured": true, 00:20:05.631 "data_offset": 0, 00:20:05.631 "data_size": 65536 00:20:05.631 }, 00:20:05.631 { 00:20:05.631 "name": null, 00:20:05.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.631 "is_configured": false, 00:20:05.631 "data_offset": 0, 00:20:05.631 "data_size": 65536 00:20:05.631 }, 00:20:05.631 { 00:20:05.631 "name": "BaseBdev3", 00:20:05.631 "uuid": "084e467a-dd31-5488-be0b-61fb49240942", 00:20:05.631 "is_configured": true, 00:20:05.631 "data_offset": 0, 00:20:05.631 "data_size": 65536 00:20:05.631 }, 00:20:05.631 { 00:20:05.631 "name": "BaseBdev4", 00:20:05.631 "uuid": "a2322496-cdbe-573e-83d9-f454b1cf72f7", 00:20:05.631 "is_configured": true, 00:20:05.631 "data_offset": 0, 00:20:05.631 "data_size": 65536 00:20:05.631 } 00:20:05.631 ] 00:20:05.631 }' 00:20:05.631 04:20:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:05.631 04:20:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.195 04:20:54 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:06.453 [2024-05-15 04:20:54.271166] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:06.453 [2024-05-15 04:20:54.271195] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:06.453 [2024-05-15 04:20:54.271270] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:06.453 [2024-05-15 04:20:54.271357] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:06.453 [2024-05-15 04:20:54.271374] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x275de90 name raid_bdev1, state offline 00:20:06.453 04:20:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.453 04:20:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@720 -- # jq length 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # '[' false = true ']' 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:06.711 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:06.968 /dev/nbd0 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:06.968 
04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:06.968 1+0 records in 00:20:06.968 1+0 records out 00:20:06.968 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000171307 s, 23.9 MB/s 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:06.968 04:20:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:07.226 /dev/nbd1 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@865 -- # local i 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # break 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:07.226 1+0 records in 00:20:07.226 1+0 records out 00:20:07.226 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254036 s, 16.1 MB/s 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # size=4096 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # return 0 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:07.226 04:20:55 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:07.226 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@743 -- # '[' false = true ']' 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@783 -- # killprocess 3918877 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@946 -- # '[' -z 3918877 ']' 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # kill -0 3918877 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # uname 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test 
-- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3918877 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3918877' 00:20:07.791 killing process with pid 3918877 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@965 -- # kill 3918877 00:20:07.791 Received shutdown signal, test time was about 60.000000 seconds 00:20:07.791 00:20:07.791 Latency(us) 00:20:07.791 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:07.791 =================================================================================================================== 00:20:07.791 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:07.791 [2024-05-15 04:20:55.789637] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:07.791 04:20:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@970 -- # wait 3918877 00:20:08.049 [2024-05-15 04:20:55.850946] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@785 -- # return 0 00:20:08.307 00:20:08.307 real 0m23.656s 00:20:08.307 user 0m33.495s 00:20:08.307 sys 0m4.285s 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1122 -- # xtrace_disable 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:20:08.307 ************************************ 00:20:08.307 END TEST raid_rebuild_test 00:20:08.307 ************************************ 00:20:08.307 04:20:56 bdev_raid -- bdev/bdev_raid.sh@812 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:20:08.307 04:20:56 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:20:08.307 04:20:56 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:20:08.307 04:20:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:08.307 ************************************ 00:20:08.307 START TEST raid_rebuild_test_sb 00:20:08.307 ************************************ 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 true false true 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=4 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local superblock=true 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local background_io=false 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local verify=true 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # echo BaseBdev3 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # echo BaseBdev4 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local strip_size 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local create_arg 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@579 -- # local data_offset 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # '[' true = true ']' 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@593 -- # create_arg+=' -s' 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # raid_pid=3921898 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@598 -- # waitforlisten 3921898 /var/tmp/spdk-raid.sock 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@827 -- # '[' -z 3921898 ']' 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@832 -- # local max_retries=100 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:08.307 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # xtrace_disable 00:20:08.307 04:20:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:08.307 [2024-05-15 04:20:56.236735] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:20:08.308 [2024-05-15 04:20:56.236796] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3921898 ] 00:20:08.308 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:08.308 Zero copy mechanism will not be used. 00:20:08.308 [2024-05-15 04:20:56.311732] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:08.566 [2024-05-15 04:20:56.420595] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:20:08.566 [2024-05-15 04:20:56.489090] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:08.566 [2024-05-15 04:20:56.489143] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:09.500 04:20:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:20:09.500 04:20:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # return 0 00:20:09.500 04:20:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:20:09.500 04:20:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:09.500 BaseBdev1_malloc 00:20:09.500 04:20:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:09.757 [2024-05-15 04:20:57.760481] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:09.757 [2024-05-15 04:20:57.760547] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.757 [2024-05-15 04:20:57.760579] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2223000 00:20:09.757 [2024-05-15 04:20:57.760595] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.757 [2024-05-15 04:20:57.762440] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.757 [2024-05-15 04:20:57.762469] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:09.757 BaseBdev1 00:20:10.015 04:20:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:20:10.015 04:20:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:10.272 BaseBdev2_malloc 00:20:10.272 04:20:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:10.529 [2024-05-15 04:20:58.316547] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:10.529 [2024-05-15 04:20:58.316620] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:10.529 [2024-05-15 04:20:58.316648] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23ce2c0 00:20:10.529 [2024-05-15 04:20:58.316663] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:10.529 [2024-05-15 04:20:58.318371] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:10.529 [2024-05-15 04:20:58.318400] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:10.529 BaseBdev2 00:20:10.530 04:20:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:20:10.530 04:20:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:10.787 BaseBdev3_malloc 00:20:10.787 04:20:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:11.045 [2024-05-15 04:20:58.901959] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:11.045 [2024-05-15 04:20:58.902020] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:11.045 [2024-05-15 04:20:58.902058] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23cfe30 00:20:11.045 [2024-05-15 04:20:58.902073] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:11.045 [2024-05-15 04:20:58.903764] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:11.045 [2024-05-15 04:20:58.903794] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:11.045 BaseBdev3 00:20:11.045 04:20:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:20:11.046 04:20:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:11.303 BaseBdev4_malloc 00:20:11.303 04:20:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:20:11.561 [2024-05-15 04:20:59.467243] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:11.561 [2024-05-15 04:20:59.467307] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:11.561 [2024-05-15 04:20:59.467336] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d31a0 00:20:11.561 [2024-05-15 04:20:59.467352] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:11.561 [2024-05-15 04:20:59.468860] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:11.561 [2024-05-15 04:20:59.468889] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:11.561 BaseBdev4 00:20:11.562 04:20:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:11.819 spare_malloc 00:20:11.819 04:20:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:12.078 spare_delay 00:20:12.078 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@609 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:12.335 [2024-05-15 04:21:00.255866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:12.335 [2024-05-15 04:21:00.255941] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:12.335 [2024-05-15 04:21:00.255981] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23d34b0 00:20:12.335 [2024-05-15 04:21:00.255996] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:12.335 [2024-05-15 04:21:00.257796] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:12.335 [2024-05-15 04:21:00.257839] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:12.335 spare 00:20:12.335 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:12.649 [2024-05-15 04:21:00.504577] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:12.649 [2024-05-15 04:21:00.506013] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:12.649 [2024-05-15 04:21:00.506080] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:12.649 [2024-05-15 04:21:00.506151] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:12.649 [2024-05-15 04:21:00.506382] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x23cae90 00:20:12.649 [2024-05-15 04:21:00.506400] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:12.649 [2024-05-15 04:21:00.506633] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x221a150 00:20:12.649 [2024-05-15 04:21:00.506836] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23cae90 00:20:12.649 [2024-05-15 04:21:00.506854] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23cae90 00:20:12.649 [2024-05-15 04:21:00.506993] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.650 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:12.929 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:12.929 "name": "raid_bdev1", 00:20:12.929 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:12.929 "strip_size_kb": 0, 00:20:12.929 "state": "online", 00:20:12.929 "raid_level": "raid1", 00:20:12.929 "superblock": true, 00:20:12.929 "num_base_bdevs": 4, 00:20:12.929 "num_base_bdevs_discovered": 4, 00:20:12.929 "num_base_bdevs_operational": 4, 00:20:12.929 "base_bdevs_list": [ 00:20:12.929 { 00:20:12.929 "name": "BaseBdev1", 00:20:12.929 "uuid": "803760b0-3da4-5628-adc8-ef23e31844c9", 00:20:12.929 "is_configured": true, 00:20:12.929 "data_offset": 2048, 00:20:12.929 "data_size": 63488 00:20:12.929 }, 00:20:12.929 { 00:20:12.929 "name": "BaseBdev2", 00:20:12.929 "uuid": "43ba3175-6b94-5581-9d1b-8594ee37e623", 00:20:12.929 "is_configured": true, 00:20:12.929 "data_offset": 2048, 00:20:12.929 "data_size": 63488 00:20:12.929 }, 00:20:12.929 { 00:20:12.929 "name": "BaseBdev3", 00:20:12.929 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:12.929 "is_configured": true, 00:20:12.929 "data_offset": 2048, 00:20:12.929 "data_size": 63488 00:20:12.929 }, 00:20:12.929 { 00:20:12.929 "name": "BaseBdev4", 00:20:12.929 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:12.929 "is_configured": true, 00:20:12.929 "data_offset": 2048, 00:20:12.929 "data_size": 63488 00:20:12.929 } 00:20:12.929 ] 00:20:12.929 }' 00:20:12.929 04:21:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:12.929 04:21:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:13.493 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:13.493 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:20:13.750 [2024-05-15 04:21:01.543515] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:13.750 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # raid_bdev_size=63488 00:20:13.750 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.750 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@619 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@619 -- # data_offset=2048 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@621 -- # '[' false = true ']' 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # '[' true = true ']' 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@625 -- # local write_unit_size 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:20:14.007 
04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:14.007 04:21:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:20:14.265 [2024-05-15 04:21:02.052703] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x221a150 00:20:14.265 /dev/nbd0 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:14.265 1+0 records in 00:20:14.265 1+0 records out 00:20:14.265 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000206372 s, 19.8 MB/s 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@629 -- # '[' raid1 = raid5f ']' 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@633 -- # write_unit_size=1 00:20:14.265 04:21:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:20:22.371 63488+0 records in 00:20:22.371 63488+0 records 
out 00:20:22.371 32505856 bytes (33 MB, 31 MiB) copied, 6.95561 s, 4.7 MB/s 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@636 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:22.371 [2024-05-15 04:21:09.347159] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:22.371 [2024-05-15 04:21:09.607901] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:22.371 "name": "raid_bdev1", 00:20:22.371 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:22.371 "strip_size_kb": 0, 00:20:22.371 "state": "online", 00:20:22.371 "raid_level": "raid1", 00:20:22.371 "superblock": true, 00:20:22.371 "num_base_bdevs": 4, 00:20:22.371 "num_base_bdevs_discovered": 3, 00:20:22.371 "num_base_bdevs_operational": 3, 00:20:22.371 "base_bdevs_list": [ 00:20:22.371 { 00:20:22.371 "name": null, 00:20:22.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:22.371 "is_configured": false, 00:20:22.371 "data_offset": 2048, 00:20:22.371 "data_size": 63488 00:20:22.371 }, 00:20:22.371 { 00:20:22.371 "name": "BaseBdev2", 00:20:22.371 "uuid": "43ba3175-6b94-5581-9d1b-8594ee37e623", 00:20:22.371 "is_configured": true, 00:20:22.371 "data_offset": 2048, 00:20:22.371 "data_size": 63488 00:20:22.371 }, 00:20:22.371 { 00:20:22.371 "name": "BaseBdev3", 00:20:22.371 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:22.371 "is_configured": true, 00:20:22.371 "data_offset": 2048, 00:20:22.371 "data_size": 63488 00:20:22.371 }, 00:20:22.371 { 00:20:22.371 "name": "BaseBdev4", 00:20:22.371 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:22.371 "is_configured": true, 00:20:22.371 "data_offset": 2048, 00:20:22.371 "data_size": 63488 00:20:22.371 } 00:20:22.371 ] 00:20:22.371 }' 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:22.371 04:21:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:22.629 04:21:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:22.887 [2024-05-15 04:21:10.646710] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:22.887 [2024-05-15 04:21:10.652066] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f2ab90 00:20:22.887 [2024-05-15 04:21:10.654353] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:22.887 04:21:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@647 -- # sleep 1 00:20:23.818 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:23.818 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:23.818 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:23.818 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:23.818 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:23.818 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.818 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:24.076 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:24.076 "name": "raid_bdev1", 00:20:24.076 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:24.076 "strip_size_kb": 0, 00:20:24.076 "state": "online", 00:20:24.076 "raid_level": "raid1", 00:20:24.076 "superblock": 
true, 00:20:24.076 "num_base_bdevs": 4, 00:20:24.076 "num_base_bdevs_discovered": 4, 00:20:24.076 "num_base_bdevs_operational": 4, 00:20:24.076 "process": { 00:20:24.076 "type": "rebuild", 00:20:24.076 "target": "spare", 00:20:24.076 "progress": { 00:20:24.076 "blocks": 24576, 00:20:24.076 "percent": 38 00:20:24.076 } 00:20:24.076 }, 00:20:24.076 "base_bdevs_list": [ 00:20:24.076 { 00:20:24.076 "name": "spare", 00:20:24.076 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 00:20:24.076 "is_configured": true, 00:20:24.076 "data_offset": 2048, 00:20:24.076 "data_size": 63488 00:20:24.076 }, 00:20:24.076 { 00:20:24.076 "name": "BaseBdev2", 00:20:24.076 "uuid": "43ba3175-6b94-5581-9d1b-8594ee37e623", 00:20:24.076 "is_configured": true, 00:20:24.076 "data_offset": 2048, 00:20:24.076 "data_size": 63488 00:20:24.076 }, 00:20:24.076 { 00:20:24.076 "name": "BaseBdev3", 00:20:24.076 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:24.076 "is_configured": true, 00:20:24.076 "data_offset": 2048, 00:20:24.076 "data_size": 63488 00:20:24.076 }, 00:20:24.076 { 00:20:24.076 "name": "BaseBdev4", 00:20:24.076 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:24.076 "is_configured": true, 00:20:24.076 "data_offset": 2048, 00:20:24.076 "data_size": 63488 00:20:24.076 } 00:20:24.076 ] 00:20:24.076 }' 00:20:24.076 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:24.076 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:24.076 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:24.076 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:24.076 04:21:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:24.334 [2024-05-15 04:21:12.264100] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:24.334 [2024-05-15 04:21:12.267474] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:24.334 [2024-05-15 04:21:12.267520] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:24.334 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:24.593 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:24.593 "name": "raid_bdev1", 00:20:24.593 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:24.593 "strip_size_kb": 0, 00:20:24.593 "state": "online", 00:20:24.593 "raid_level": "raid1", 00:20:24.593 "superblock": true, 00:20:24.593 "num_base_bdevs": 4, 00:20:24.593 "num_base_bdevs_discovered": 3, 00:20:24.593 "num_base_bdevs_operational": 3, 00:20:24.593 "base_bdevs_list": [ 00:20:24.593 { 00:20:24.593 "name": null, 00:20:24.593 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:24.593 "is_configured": false, 00:20:24.593 "data_offset": 2048, 00:20:24.593 "data_size": 63488 00:20:24.593 }, 00:20:24.593 { 00:20:24.593 "name": "BaseBdev2", 00:20:24.593 "uuid": "43ba3175-6b94-5581-9d1b-8594ee37e623", 00:20:24.593 "is_configured": true, 00:20:24.593 "data_offset": 2048, 00:20:24.593 "data_size": 63488 00:20:24.593 }, 00:20:24.593 { 00:20:24.593 "name": "BaseBdev3", 00:20:24.593 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:24.593 "is_configured": true, 00:20:24.593 "data_offset": 2048, 00:20:24.593 "data_size": 63488 00:20:24.593 }, 00:20:24.593 { 00:20:24.593 "name": "BaseBdev4", 00:20:24.593 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:24.593 "is_configured": true, 00:20:24.593 "data_offset": 2048, 00:20:24.593 "data_size": 63488 00:20:24.593 } 00:20:24.593 ] 00:20:24.593 }' 00:20:24.593 04:21:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:24.593 04:21:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:25.157 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:25.157 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:25.157 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:25.157 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:25.157 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:25.157 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.157 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.415 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:25.415 "name": "raid_bdev1", 00:20:25.415 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:25.415 "strip_size_kb": 0, 00:20:25.415 "state": "online", 00:20:25.415 "raid_level": "raid1", 00:20:25.415 "superblock": true, 00:20:25.415 "num_base_bdevs": 4, 00:20:25.415 "num_base_bdevs_discovered": 3, 00:20:25.415 "num_base_bdevs_operational": 3, 00:20:25.415 "base_bdevs_list": [ 00:20:25.415 { 00:20:25.415 "name": null, 00:20:25.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.415 "is_configured": false, 00:20:25.415 "data_offset": 2048, 00:20:25.415 "data_size": 63488 00:20:25.415 }, 00:20:25.415 { 00:20:25.415 "name": "BaseBdev2", 00:20:25.415 "uuid": 
"43ba3175-6b94-5581-9d1b-8594ee37e623", 00:20:25.415 "is_configured": true, 00:20:25.415 "data_offset": 2048, 00:20:25.415 "data_size": 63488 00:20:25.415 }, 00:20:25.415 { 00:20:25.415 "name": "BaseBdev3", 00:20:25.415 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:25.415 "is_configured": true, 00:20:25.415 "data_offset": 2048, 00:20:25.415 "data_size": 63488 00:20:25.415 }, 00:20:25.415 { 00:20:25.415 "name": "BaseBdev4", 00:20:25.415 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:25.415 "is_configured": true, 00:20:25.415 "data_offset": 2048, 00:20:25.415 "data_size": 63488 00:20:25.415 } 00:20:25.415 ] 00:20:25.415 }' 00:20:25.415 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:25.415 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:25.415 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:25.671 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:25.671 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:25.671 [2024-05-15 04:21:13.672398] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:25.671 [2024-05-15 04:21:13.677300] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23cdbf0 00:20:25.671 [2024-05-15 04:21:13.678722] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:25.927 04:21:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # sleep 1 00:20:26.859 04:21:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:26.859 04:21:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:26.859 04:21:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:26.859 04:21:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:26.859 04:21:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:26.859 04:21:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.859 04:21:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.117 04:21:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:27.117 "name": "raid_bdev1", 00:20:27.117 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:27.117 "strip_size_kb": 0, 00:20:27.117 "state": "online", 00:20:27.117 "raid_level": "raid1", 00:20:27.117 "superblock": true, 00:20:27.117 "num_base_bdevs": 4, 00:20:27.117 "num_base_bdevs_discovered": 4, 00:20:27.117 "num_base_bdevs_operational": 4, 00:20:27.117 "process": { 00:20:27.117 "type": "rebuild", 00:20:27.117 "target": "spare", 00:20:27.117 "progress": { 00:20:27.117 "blocks": 24576, 00:20:27.117 "percent": 38 00:20:27.117 } 00:20:27.117 }, 00:20:27.117 "base_bdevs_list": [ 00:20:27.117 { 00:20:27.117 "name": "spare", 00:20:27.117 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 00:20:27.117 "is_configured": true, 00:20:27.117 "data_offset": 2048, 00:20:27.117 
"data_size": 63488 00:20:27.117 }, 00:20:27.117 { 00:20:27.117 "name": "BaseBdev2", 00:20:27.117 "uuid": "43ba3175-6b94-5581-9d1b-8594ee37e623", 00:20:27.117 "is_configured": true, 00:20:27.117 "data_offset": 2048, 00:20:27.117 "data_size": 63488 00:20:27.117 }, 00:20:27.117 { 00:20:27.117 "name": "BaseBdev3", 00:20:27.117 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:27.117 "is_configured": true, 00:20:27.117 "data_offset": 2048, 00:20:27.117 "data_size": 63488 00:20:27.117 }, 00:20:27.117 { 00:20:27.117 "name": "BaseBdev4", 00:20:27.117 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:27.117 "is_configured": true, 00:20:27.117 "data_offset": 2048, 00:20:27.117 "data_size": 63488 00:20:27.117 } 00:20:27.117 ] 00:20:27.117 }' 00:20:27.117 04:21:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:27.117 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:27.117 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:27.117 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:27.117 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@666 -- # '[' true = true ']' 00:20:27.117 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@666 -- # '[' = false ']' 00:20:27.117 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 666: [: =: unary operator expected 00:20:27.117 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=4 00:20:27.117 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:20:27.117 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@693 -- # '[' 4 -gt 2 ']' 00:20:27.117 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@695 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:27.375 [2024-05-15 04:21:15.298331] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:27.634 [2024-05-15 04:21:15.392773] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x23cdbf0 00:20:27.634 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # base_bdevs[1]= 00:20:27.634 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@699 -- # (( num_base_bdevs_operational-- )) 00:20:27.634 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@702 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:27.634 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:27.634 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:27.634 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:27.634 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:27.634 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.634 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.892 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:27.892 "name": "raid_bdev1", 00:20:27.892 
"uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:27.892 "strip_size_kb": 0, 00:20:27.892 "state": "online", 00:20:27.892 "raid_level": "raid1", 00:20:27.892 "superblock": true, 00:20:27.892 "num_base_bdevs": 4, 00:20:27.892 "num_base_bdevs_discovered": 3, 00:20:27.892 "num_base_bdevs_operational": 3, 00:20:27.892 "process": { 00:20:27.892 "type": "rebuild", 00:20:27.892 "target": "spare", 00:20:27.892 "progress": { 00:20:27.892 "blocks": 40960, 00:20:27.892 "percent": 64 00:20:27.892 } 00:20:27.892 }, 00:20:27.892 "base_bdevs_list": [ 00:20:27.892 { 00:20:27.892 "name": "spare", 00:20:27.892 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 00:20:27.892 "is_configured": true, 00:20:27.892 "data_offset": 2048, 00:20:27.892 "data_size": 63488 00:20:27.892 }, 00:20:27.892 { 00:20:27.892 "name": null, 00:20:27.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:27.892 "is_configured": false, 00:20:27.892 "data_offset": 2048, 00:20:27.892 "data_size": 63488 00:20:27.892 }, 00:20:27.892 { 00:20:27.892 "name": "BaseBdev3", 00:20:27.892 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:27.892 "is_configured": true, 00:20:27.892 "data_offset": 2048, 00:20:27.892 "data_size": 63488 00:20:27.892 }, 00:20:27.892 { 00:20:27.892 "name": "BaseBdev4", 00:20:27.892 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:27.892 "is_configured": true, 00:20:27.892 "data_offset": 2048, 00:20:27.892 "data_size": 63488 00:20:27.892 } 00:20:27.892 ] 00:20:27.892 }' 00:20:27.892 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:27.892 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:27.892 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:27.892 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:27.892 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # local timeout=751 00:20:27.892 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:20:27.892 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:27.892 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:27.893 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:27.893 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:27.893 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:27.893 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.893 04:21:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:28.150 04:21:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:28.150 "name": "raid_bdev1", 00:20:28.150 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:28.150 "strip_size_kb": 0, 00:20:28.150 "state": "online", 00:20:28.150 "raid_level": "raid1", 00:20:28.150 "superblock": true, 00:20:28.150 "num_base_bdevs": 4, 00:20:28.150 "num_base_bdevs_discovered": 3, 00:20:28.150 "num_base_bdevs_operational": 3, 00:20:28.150 "process": { 00:20:28.150 "type": "rebuild", 
00:20:28.150 "target": "spare", 00:20:28.150 "progress": { 00:20:28.150 "blocks": 47104, 00:20:28.150 "percent": 74 00:20:28.150 } 00:20:28.150 }, 00:20:28.150 "base_bdevs_list": [ 00:20:28.150 { 00:20:28.150 "name": "spare", 00:20:28.150 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 00:20:28.150 "is_configured": true, 00:20:28.150 "data_offset": 2048, 00:20:28.150 "data_size": 63488 00:20:28.150 }, 00:20:28.150 { 00:20:28.150 "name": null, 00:20:28.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.150 "is_configured": false, 00:20:28.150 "data_offset": 2048, 00:20:28.150 "data_size": 63488 00:20:28.150 }, 00:20:28.150 { 00:20:28.150 "name": "BaseBdev3", 00:20:28.150 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:28.150 "is_configured": true, 00:20:28.150 "data_offset": 2048, 00:20:28.150 "data_size": 63488 00:20:28.150 }, 00:20:28.150 { 00:20:28.150 "name": "BaseBdev4", 00:20:28.150 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:28.150 "is_configured": true, 00:20:28.150 "data_offset": 2048, 00:20:28.150 "data_size": 63488 00:20:28.150 } 00:20:28.150 ] 00:20:28.150 }' 00:20:28.150 04:21:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:28.150 04:21:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:28.150 04:21:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:28.150 04:21:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:28.150 04:21:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@711 -- # sleep 1 00:20:29.082 [2024-05-15 04:21:16.804424] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:20:29.083 [2024-05-15 04:21:16.804488] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:20:29.083 [2024-05-15 04:21:16.804625] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:29.340 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:20:29.340 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:29.340 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:29.340 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:29.340 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:29.340 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:29.340 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.340 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:29.597 "name": "raid_bdev1", 00:20:29.597 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:29.597 "strip_size_kb": 0, 00:20:29.597 "state": "online", 00:20:29.597 "raid_level": "raid1", 00:20:29.597 "superblock": true, 00:20:29.597 "num_base_bdevs": 4, 00:20:29.597 "num_base_bdevs_discovered": 3, 00:20:29.597 "num_base_bdevs_operational": 3, 00:20:29.597 "base_bdevs_list": [ 00:20:29.597 { 00:20:29.597 
"name": "spare", 00:20:29.597 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 00:20:29.597 "is_configured": true, 00:20:29.597 "data_offset": 2048, 00:20:29.597 "data_size": 63488 00:20:29.597 }, 00:20:29.597 { 00:20:29.597 "name": null, 00:20:29.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.597 "is_configured": false, 00:20:29.597 "data_offset": 2048, 00:20:29.597 "data_size": 63488 00:20:29.597 }, 00:20:29.597 { 00:20:29.597 "name": "BaseBdev3", 00:20:29.597 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:29.597 "is_configured": true, 00:20:29.597 "data_offset": 2048, 00:20:29.597 "data_size": 63488 00:20:29.597 }, 00:20:29.597 { 00:20:29.597 "name": "BaseBdev4", 00:20:29.597 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:29.597 "is_configured": true, 00:20:29.597 "data_offset": 2048, 00:20:29.597 "data_size": 63488 00:20:29.597 } 00:20:29.597 ] 00:20:29.597 }' 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@709 -- # break 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.597 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:29.854 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:29.854 "name": "raid_bdev1", 00:20:29.855 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:29.855 "strip_size_kb": 0, 00:20:29.855 "state": "online", 00:20:29.855 "raid_level": "raid1", 00:20:29.855 "superblock": true, 00:20:29.855 "num_base_bdevs": 4, 00:20:29.855 "num_base_bdevs_discovered": 3, 00:20:29.855 "num_base_bdevs_operational": 3, 00:20:29.855 "base_bdevs_list": [ 00:20:29.855 { 00:20:29.855 "name": "spare", 00:20:29.855 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 00:20:29.855 "is_configured": true, 00:20:29.855 "data_offset": 2048, 00:20:29.855 "data_size": 63488 00:20:29.855 }, 00:20:29.855 { 00:20:29.855 "name": null, 00:20:29.855 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:29.855 "is_configured": false, 00:20:29.855 "data_offset": 2048, 00:20:29.855 "data_size": 63488 00:20:29.855 }, 00:20:29.855 { 00:20:29.855 "name": "BaseBdev3", 00:20:29.855 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:29.855 "is_configured": true, 00:20:29.855 "data_offset": 2048, 00:20:29.855 "data_size": 63488 00:20:29.855 }, 00:20:29.855 { 00:20:29.855 "name": "BaseBdev4", 00:20:29.855 
"uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:29.855 "is_configured": true, 00:20:29.855 "data_offset": 2048, 00:20:29.855 "data_size": 63488 00:20:29.855 } 00:20:29.855 ] 00:20:29.855 }' 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.855 04:21:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.113 04:21:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:30.113 "name": "raid_bdev1", 00:20:30.113 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:30.113 "strip_size_kb": 0, 00:20:30.113 "state": "online", 00:20:30.113 "raid_level": "raid1", 00:20:30.113 "superblock": true, 00:20:30.113 "num_base_bdevs": 4, 00:20:30.113 "num_base_bdevs_discovered": 3, 00:20:30.113 "num_base_bdevs_operational": 3, 00:20:30.113 "base_bdevs_list": [ 00:20:30.113 { 00:20:30.113 "name": "spare", 00:20:30.113 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 00:20:30.113 "is_configured": true, 00:20:30.113 "data_offset": 2048, 00:20:30.113 "data_size": 63488 00:20:30.113 }, 00:20:30.113 { 00:20:30.113 "name": null, 00:20:30.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:30.113 "is_configured": false, 00:20:30.113 "data_offset": 2048, 00:20:30.113 "data_size": 63488 00:20:30.113 }, 00:20:30.113 { 00:20:30.113 "name": "BaseBdev3", 00:20:30.113 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:30.113 "is_configured": true, 00:20:30.113 "data_offset": 2048, 00:20:30.113 "data_size": 63488 00:20:30.113 }, 00:20:30.113 { 00:20:30.113 "name": "BaseBdev4", 00:20:30.113 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:30.113 "is_configured": true, 00:20:30.113 "data_offset": 2048, 00:20:30.113 "data_size": 63488 00:20:30.113 } 00:20:30.113 ] 00:20:30.113 }' 00:20:30.113 04:21:18 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:30.113 04:21:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:30.677 04:21:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:30.935 [2024-05-15 04:21:18.863180] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:30.935 [2024-05-15 04:21:18.863208] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:30.935 [2024-05-15 04:21:18.863289] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:30.935 [2024-05-15 04:21:18.863377] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:30.935 [2024-05-15 04:21:18.863394] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23cae90 name raid_bdev1, state offline 00:20:30.935 04:21:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.935 04:21:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@720 -- # jq length 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # '[' false = true ']' 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:31.192 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:20:31.450 /dev/nbd0 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w 
nbd0 /proc/partitions 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:31.450 1+0 records in 00:20:31.450 1+0 records out 00:20:31.450 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000191723 s, 21.4 MB/s 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:31.450 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:20:31.708 /dev/nbd1 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@865 -- # local i 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # break 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:20:31.966 1+0 records in 00:20:31.966 1+0 records out 00:20:31.966 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000221512 s, 18.5 MB/s 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # size=4096 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:20:31.966 
04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # return 0 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:31.966 04:21:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:20:32.224 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:20:32.224 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:20:32.224 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:20:32.224 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:32.224 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:32.224 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:20:32.224 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:32.224 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:32.224 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:20:32.224 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:20:32.483 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:20:32.483 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:20:32.483 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:20:32.483 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:20:32.483 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:20:32.483 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:20:32.483 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:20:32.483 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:20:32.483 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@743 -- # '[' true = true ']' 00:20:32.483 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:32.741 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@746 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:33.000 [2024-05-15 04:21:20.918070] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:33.000 [2024-05-15 04:21:20.918144] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:33.000 [2024-05-15 04:21:20.918177] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x221c1f0 00:20:33.000 [2024-05-15 04:21:20.918193] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:33.000 [2024-05-15 04:21:20.919970] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:33.000 [2024-05-15 04:21:20.920001] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:33.000 [2024-05-15 04:21:20.920118] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:33.000 [2024-05-15 04:21:20.920160] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:33.000 [2024-05-15 04:21:20.920303] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:33.000 [2024-05-15 04:21:20.920396] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:33.000 spare 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.000 04:21:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:33.258 [2024-05-15 04:21:21.020750] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x221b330 00:20:33.258 [2024-05-15 04:21:21.020773] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:33.258 [2024-05-15 04:21:21.021020] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x221bc30 00:20:33.258 [2024-05-15 04:21:21.021234] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x221b330 00:20:33.258 [2024-05-15 04:21:21.021251] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
raid_bdev1, raid_bdev 0x221b330 00:20:33.258 [2024-05-15 04:21:21.021386] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:33.258 04:21:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:33.258 "name": "raid_bdev1", 00:20:33.258 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:33.258 "strip_size_kb": 0, 00:20:33.258 "state": "online", 00:20:33.258 "raid_level": "raid1", 00:20:33.258 "superblock": true, 00:20:33.258 "num_base_bdevs": 4, 00:20:33.258 "num_base_bdevs_discovered": 3, 00:20:33.258 "num_base_bdevs_operational": 3, 00:20:33.258 "base_bdevs_list": [ 00:20:33.258 { 00:20:33.258 "name": "spare", 00:20:33.258 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 00:20:33.258 "is_configured": true, 00:20:33.258 "data_offset": 2048, 00:20:33.258 "data_size": 63488 00:20:33.258 }, 00:20:33.258 { 00:20:33.258 "name": null, 00:20:33.258 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.258 "is_configured": false, 00:20:33.258 "data_offset": 2048, 00:20:33.258 "data_size": 63488 00:20:33.258 }, 00:20:33.258 { 00:20:33.258 "name": "BaseBdev3", 00:20:33.258 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:33.258 "is_configured": true, 00:20:33.258 "data_offset": 2048, 00:20:33.258 "data_size": 63488 00:20:33.258 }, 00:20:33.258 { 00:20:33.258 "name": "BaseBdev4", 00:20:33.258 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:33.258 "is_configured": true, 00:20:33.258 "data_offset": 2048, 00:20:33.258 "data_size": 63488 00:20:33.258 } 00:20:33.258 ] 00:20:33.258 }' 00:20:33.258 04:21:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:33.258 04:21:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:33.823 04:21:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:33.823 04:21:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:33.823 04:21:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:33.823 04:21:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:33.823 04:21:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:33.823 04:21:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.823 04:21:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:34.081 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:34.081 "name": "raid_bdev1", 00:20:34.081 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:34.081 "strip_size_kb": 0, 00:20:34.081 "state": "online", 00:20:34.081 "raid_level": "raid1", 00:20:34.081 "superblock": true, 00:20:34.081 "num_base_bdevs": 4, 00:20:34.081 "num_base_bdevs_discovered": 3, 00:20:34.081 "num_base_bdevs_operational": 3, 00:20:34.081 "base_bdevs_list": [ 00:20:34.081 { 00:20:34.081 "name": "spare", 00:20:34.081 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 00:20:34.081 "is_configured": true, 00:20:34.081 "data_offset": 2048, 00:20:34.081 "data_size": 63488 00:20:34.081 }, 00:20:34.081 { 00:20:34.081 "name": null, 00:20:34.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.081 "is_configured": false, 00:20:34.081 "data_offset": 2048, 00:20:34.081 "data_size": 
63488 00:20:34.081 }, 00:20:34.081 { 00:20:34.081 "name": "BaseBdev3", 00:20:34.081 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:34.081 "is_configured": true, 00:20:34.081 "data_offset": 2048, 00:20:34.081 "data_size": 63488 00:20:34.081 }, 00:20:34.081 { 00:20:34.081 "name": "BaseBdev4", 00:20:34.081 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:34.081 "is_configured": true, 00:20:34.081 "data_offset": 2048, 00:20:34.081 "data_size": 63488 00:20:34.081 } 00:20:34.081 ] 00:20:34.081 }' 00:20:34.081 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:34.338 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:34.338 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:34.338 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:34.338 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.338 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # jq -r '.[].base_bdevs_list[0].name' 00:20:34.595 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@750 -- # [[ spare == \s\p\a\r\e ]] 00:20:34.595 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:34.852 [2024-05-15 04:21:22.666786] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.852 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:35.110 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:35.110 "name": "raid_bdev1", 00:20:35.110 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:35.110 "strip_size_kb": 0, 00:20:35.110 "state": "online", 00:20:35.110 "raid_level": "raid1", 00:20:35.110 "superblock": true, 00:20:35.110 "num_base_bdevs": 4, 00:20:35.110 "num_base_bdevs_discovered": 2, 00:20:35.110 
"num_base_bdevs_operational": 2, 00:20:35.110 "base_bdevs_list": [ 00:20:35.110 { 00:20:35.110 "name": null, 00:20:35.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:35.110 "is_configured": false, 00:20:35.110 "data_offset": 2048, 00:20:35.110 "data_size": 63488 00:20:35.110 }, 00:20:35.110 { 00:20:35.110 "name": null, 00:20:35.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:35.110 "is_configured": false, 00:20:35.110 "data_offset": 2048, 00:20:35.110 "data_size": 63488 00:20:35.110 }, 00:20:35.110 { 00:20:35.110 "name": "BaseBdev3", 00:20:35.110 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:35.110 "is_configured": true, 00:20:35.110 "data_offset": 2048, 00:20:35.110 "data_size": 63488 00:20:35.110 }, 00:20:35.110 { 00:20:35.110 "name": "BaseBdev4", 00:20:35.110 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:35.110 "is_configured": true, 00:20:35.110 "data_offset": 2048, 00:20:35.110 "data_size": 63488 00:20:35.110 } 00:20:35.110 ] 00:20:35.110 }' 00:20:35.110 04:21:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:35.110 04:21:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:35.673 04:21:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:35.931 [2024-05-15 04:21:23.757706] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:35.931 [2024-05-15 04:21:23.757929] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:35.931 [2024-05-15 04:21:23.757951] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:20:35.931 [2024-05-15 04:21:23.757984] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:35.931 [2024-05-15 04:21:23.763047] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f2ab90 00:20:35.931 [2024-05-15 04:21:23.765184] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:35.931 04:21:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # sleep 1 00:20:36.860 04:21:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@757 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:36.860 04:21:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:36.860 04:21:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:36.861 04:21:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:36.861 04:21:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:36.861 04:21:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.861 04:21:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:37.118 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:37.118 "name": "raid_bdev1", 00:20:37.118 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:37.118 "strip_size_kb": 0, 00:20:37.118 "state": "online", 00:20:37.118 "raid_level": "raid1", 00:20:37.118 "superblock": true, 00:20:37.118 "num_base_bdevs": 4, 00:20:37.118 "num_base_bdevs_discovered": 3, 00:20:37.118 "num_base_bdevs_operational": 3, 00:20:37.118 "process": { 00:20:37.118 "type": "rebuild", 00:20:37.118 "target": "spare", 00:20:37.118 "progress": { 00:20:37.118 "blocks": 24576, 00:20:37.118 "percent": 38 00:20:37.118 } 00:20:37.118 }, 00:20:37.118 "base_bdevs_list": [ 00:20:37.118 { 00:20:37.118 "name": "spare", 00:20:37.118 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 00:20:37.118 "is_configured": true, 00:20:37.118 "data_offset": 2048, 00:20:37.118 "data_size": 63488 00:20:37.118 }, 00:20:37.118 { 00:20:37.118 "name": null, 00:20:37.118 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.118 "is_configured": false, 00:20:37.118 "data_offset": 2048, 00:20:37.118 "data_size": 63488 00:20:37.118 }, 00:20:37.118 { 00:20:37.118 "name": "BaseBdev3", 00:20:37.118 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:37.118 "is_configured": true, 00:20:37.118 "data_offset": 2048, 00:20:37.118 "data_size": 63488 00:20:37.118 }, 00:20:37.118 { 00:20:37.118 "name": "BaseBdev4", 00:20:37.118 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:37.118 "is_configured": true, 00:20:37.118 "data_offset": 2048, 00:20:37.118 "data_size": 63488 00:20:37.118 } 00:20:37.118 ] 00:20:37.118 }' 00:20:37.118 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:37.118 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:37.118 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:37.118 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:37.118 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:37.375 [2024-05-15 04:21:25.387319] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:37.632 [2024-05-15 04:21:25.479313] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:37.632 [2024-05-15 04:21:25.479370] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.632 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:37.889 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:37.889 "name": "raid_bdev1", 00:20:37.889 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:37.889 "strip_size_kb": 0, 00:20:37.889 "state": "online", 00:20:37.889 "raid_level": "raid1", 00:20:37.889 "superblock": true, 00:20:37.889 "num_base_bdevs": 4, 00:20:37.889 "num_base_bdevs_discovered": 2, 00:20:37.889 "num_base_bdevs_operational": 2, 00:20:37.889 "base_bdevs_list": [ 00:20:37.889 { 00:20:37.889 "name": null, 00:20:37.889 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.889 "is_configured": false, 00:20:37.889 "data_offset": 2048, 00:20:37.889 "data_size": 63488 00:20:37.889 }, 00:20:37.889 { 00:20:37.889 "name": null, 00:20:37.889 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:37.889 "is_configured": false, 00:20:37.889 "data_offset": 2048, 00:20:37.889 "data_size": 63488 00:20:37.889 }, 00:20:37.889 { 00:20:37.889 "name": "BaseBdev3", 00:20:37.889 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:37.889 "is_configured": true, 00:20:37.889 "data_offset": 2048, 00:20:37.889 "data_size": 63488 00:20:37.889 }, 00:20:37.889 { 00:20:37.889 "name": "BaseBdev4", 00:20:37.889 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:37.889 "is_configured": true, 00:20:37.889 "data_offset": 2048, 00:20:37.889 "data_size": 63488 00:20:37.889 } 00:20:37.889 ] 00:20:37.889 }' 00:20:37.889 04:21:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:37.889 04:21:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:38.494 04:21:26 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:38.753 [2024-05-15 04:21:26.527529] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:38.753 [2024-05-15 04:21:26.527605] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:38.753 [2024-05-15 04:21:26.527634] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x221a0c0 00:20:38.753 [2024-05-15 04:21:26.527648] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:38.753 [2024-05-15 04:21:26.528113] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:38.753 [2024-05-15 04:21:26.528142] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:38.753 [2024-05-15 04:21:26.528245] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:20:38.753 [2024-05-15 04:21:26.528273] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:20:38.753 [2024-05-15 04:21:26.528286] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:20:38.753 [2024-05-15 04:21:26.528313] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:38.753 [2024-05-15 04:21:26.533350] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23d2270 00:20:38.753 spare 00:20:38.753 [2024-05-15 04:21:26.534894] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:38.753 04:21:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # sleep 1 00:20:39.684 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:39.684 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:39.684 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:39.684 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:39.684 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:39.684 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.684 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:39.942 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:39.942 "name": "raid_bdev1", 00:20:39.942 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:39.942 "strip_size_kb": 0, 00:20:39.942 "state": "online", 00:20:39.942 "raid_level": "raid1", 00:20:39.942 "superblock": true, 00:20:39.942 "num_base_bdevs": 4, 00:20:39.942 "num_base_bdevs_discovered": 3, 00:20:39.942 "num_base_bdevs_operational": 3, 00:20:39.942 "process": { 00:20:39.942 "type": "rebuild", 00:20:39.942 "target": "spare", 00:20:39.942 "progress": { 00:20:39.942 "blocks": 24576, 00:20:39.942 "percent": 38 00:20:39.942 } 00:20:39.942 }, 00:20:39.942 "base_bdevs_list": [ 00:20:39.942 { 00:20:39.942 "name": "spare", 00:20:39.942 "uuid": "c8fcfb7d-22bd-500e-b964-3ec6421bcefe", 
00:20:39.942 "is_configured": true, 00:20:39.942 "data_offset": 2048, 00:20:39.942 "data_size": 63488 00:20:39.942 }, 00:20:39.942 { 00:20:39.942 "name": null, 00:20:39.942 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:39.942 "is_configured": false, 00:20:39.942 "data_offset": 2048, 00:20:39.942 "data_size": 63488 00:20:39.942 }, 00:20:39.942 { 00:20:39.942 "name": "BaseBdev3", 00:20:39.942 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:39.942 "is_configured": true, 00:20:39.942 "data_offset": 2048, 00:20:39.942 "data_size": 63488 00:20:39.942 }, 00:20:39.942 { 00:20:39.942 "name": "BaseBdev4", 00:20:39.942 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:39.942 "is_configured": true, 00:20:39.942 "data_offset": 2048, 00:20:39.942 "data_size": 63488 00:20:39.942 } 00:20:39.942 ] 00:20:39.942 }' 00:20:39.942 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:39.942 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:39.942 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:39.942 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:39.942 04:21:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:20:40.201 [2024-05-15 04:21:28.106038] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:40.201 [2024-05-15 04:21:28.148130] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:40.201 [2024-05-15 04:21:28.148183] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.201 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:40.459 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:40.459 "name": "raid_bdev1", 00:20:40.459 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:40.459 "strip_size_kb": 0, 00:20:40.459 "state": "online", 00:20:40.459 
"raid_level": "raid1", 00:20:40.459 "superblock": true, 00:20:40.459 "num_base_bdevs": 4, 00:20:40.459 "num_base_bdevs_discovered": 2, 00:20:40.459 "num_base_bdevs_operational": 2, 00:20:40.459 "base_bdevs_list": [ 00:20:40.459 { 00:20:40.459 "name": null, 00:20:40.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.459 "is_configured": false, 00:20:40.459 "data_offset": 2048, 00:20:40.459 "data_size": 63488 00:20:40.459 }, 00:20:40.459 { 00:20:40.459 "name": null, 00:20:40.459 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.459 "is_configured": false, 00:20:40.459 "data_offset": 2048, 00:20:40.459 "data_size": 63488 00:20:40.459 }, 00:20:40.459 { 00:20:40.459 "name": "BaseBdev3", 00:20:40.459 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:40.459 "is_configured": true, 00:20:40.459 "data_offset": 2048, 00:20:40.459 "data_size": 63488 00:20:40.459 }, 00:20:40.459 { 00:20:40.459 "name": "BaseBdev4", 00:20:40.459 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:40.459 "is_configured": true, 00:20:40.459 "data_offset": 2048, 00:20:40.459 "data_size": 63488 00:20:40.459 } 00:20:40.459 ] 00:20:40.459 }' 00:20:40.459 04:21:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:40.459 04:21:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:41.024 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:41.024 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:41.024 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:41.024 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:41.024 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:41.024 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.024 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:41.281 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:41.281 "name": "raid_bdev1", 00:20:41.281 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:41.281 "strip_size_kb": 0, 00:20:41.281 "state": "online", 00:20:41.281 "raid_level": "raid1", 00:20:41.281 "superblock": true, 00:20:41.281 "num_base_bdevs": 4, 00:20:41.281 "num_base_bdevs_discovered": 2, 00:20:41.281 "num_base_bdevs_operational": 2, 00:20:41.281 "base_bdevs_list": [ 00:20:41.281 { 00:20:41.281 "name": null, 00:20:41.281 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.281 "is_configured": false, 00:20:41.281 "data_offset": 2048, 00:20:41.282 "data_size": 63488 00:20:41.282 }, 00:20:41.282 { 00:20:41.282 "name": null, 00:20:41.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:41.282 "is_configured": false, 00:20:41.282 "data_offset": 2048, 00:20:41.282 "data_size": 63488 00:20:41.282 }, 00:20:41.282 { 00:20:41.282 "name": "BaseBdev3", 00:20:41.282 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:41.282 "is_configured": true, 00:20:41.282 "data_offset": 2048, 00:20:41.282 "data_size": 63488 00:20:41.282 }, 00:20:41.282 { 00:20:41.282 "name": "BaseBdev4", 00:20:41.282 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:41.282 "is_configured": true, 00:20:41.282 
"data_offset": 2048, 00:20:41.282 "data_size": 63488 00:20:41.282 } 00:20:41.282 ] 00:20:41.282 }' 00:20:41.282 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:41.282 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:41.282 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:41.538 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:41.538 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:20:41.538 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:41.795 [2024-05-15 04:21:29.769934] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:41.795 [2024-05-15 04:21:29.769992] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:41.795 [2024-05-15 04:21:29.770019] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x221b7a0 00:20:41.795 [2024-05-15 04:21:29.770035] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:41.795 [2024-05-15 04:21:29.770445] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:41.795 [2024-05-15 04:21:29.770472] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:41.795 [2024-05-15 04:21:29.770557] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:20:41.795 [2024-05-15 04:21:29.770577] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:41.795 [2024-05-15 04:21:29.770588] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:41.795 BaseBdev1 00:20:41.795 04:21:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # sleep 1 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:20:43.167 04:21:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.167 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:43.167 "name": "raid_bdev1", 00:20:43.167 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:43.167 "strip_size_kb": 0, 00:20:43.167 "state": "online", 00:20:43.167 "raid_level": "raid1", 00:20:43.167 "superblock": true, 00:20:43.167 "num_base_bdevs": 4, 00:20:43.167 "num_base_bdevs_discovered": 2, 00:20:43.167 "num_base_bdevs_operational": 2, 00:20:43.167 "base_bdevs_list": [ 00:20:43.167 { 00:20:43.167 "name": null, 00:20:43.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.167 "is_configured": false, 00:20:43.167 "data_offset": 2048, 00:20:43.167 "data_size": 63488 00:20:43.167 }, 00:20:43.167 { 00:20:43.167 "name": null, 00:20:43.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.167 "is_configured": false, 00:20:43.167 "data_offset": 2048, 00:20:43.167 "data_size": 63488 00:20:43.167 }, 00:20:43.167 { 00:20:43.167 "name": "BaseBdev3", 00:20:43.167 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:43.167 "is_configured": true, 00:20:43.167 "data_offset": 2048, 00:20:43.167 "data_size": 63488 00:20:43.167 }, 00:20:43.167 { 00:20:43.167 "name": "BaseBdev4", 00:20:43.167 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:43.167 "is_configured": true, 00:20:43.167 "data_offset": 2048, 00:20:43.167 "data_size": 63488 00:20:43.167 } 00:20:43.167 ] 00:20:43.167 }' 00:20:43.167 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:43.167 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:43.733 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:43.733 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:43.733 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:43.733 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:43.733 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:43.733 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.733 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:43.991 "name": "raid_bdev1", 00:20:43.991 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:43.991 "strip_size_kb": 0, 00:20:43.991 "state": "online", 00:20:43.991 "raid_level": "raid1", 00:20:43.991 "superblock": true, 00:20:43.991 "num_base_bdevs": 4, 00:20:43.991 "num_base_bdevs_discovered": 2, 00:20:43.991 "num_base_bdevs_operational": 2, 00:20:43.991 "base_bdevs_list": [ 00:20:43.991 { 00:20:43.991 "name": null, 00:20:43.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.991 "is_configured": false, 00:20:43.991 "data_offset": 2048, 00:20:43.991 "data_size": 63488 00:20:43.991 }, 00:20:43.991 { 00:20:43.991 "name": null, 00:20:43.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:43.991 "is_configured": false, 00:20:43.991 "data_offset": 2048, 00:20:43.991 
"data_size": 63488 00:20:43.991 }, 00:20:43.991 { 00:20:43.991 "name": "BaseBdev3", 00:20:43.991 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:43.991 "is_configured": true, 00:20:43.991 "data_offset": 2048, 00:20:43.991 "data_size": 63488 00:20:43.991 }, 00:20:43.991 { 00:20:43.991 "name": "BaseBdev4", 00:20:43.991 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:43.991 "is_configured": true, 00:20:43.991 "data_offset": 2048, 00:20:43.991 "data_size": 63488 00:20:43.991 } 00:20:43.991 ] 00:20:43.991 }' 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:43.991 04:21:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:20:44.248 [2024-05-15 04:21:32.152274] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:44.248 [2024-05-15 04:21:32.152436] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:20:44.248 [2024-05-15 04:21:32.152457] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:20:44.248 request: 00:20:44.248 { 00:20:44.248 "raid_bdev": "raid_bdev1", 00:20:44.248 "base_bdev": "BaseBdev1", 00:20:44.248 "method": "bdev_raid_add_base_bdev", 00:20:44.248 
"req_id": 1 00:20:44.248 } 00:20:44.248 Got JSON-RPC error response 00:20:44.248 response: 00:20:44.248 { 00:20:44.248 "code": -22, 00:20:44.248 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:20:44.248 } 00:20:44.248 04:21:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:20:44.248 04:21:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:44.248 04:21:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:44.248 04:21:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:44.248 04:21:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # sleep 1 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.179 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:45.436 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:45.436 "name": "raid_bdev1", 00:20:45.436 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:45.436 "strip_size_kb": 0, 00:20:45.436 "state": "online", 00:20:45.436 "raid_level": "raid1", 00:20:45.436 "superblock": true, 00:20:45.436 "num_base_bdevs": 4, 00:20:45.436 "num_base_bdevs_discovered": 2, 00:20:45.436 "num_base_bdevs_operational": 2, 00:20:45.436 "base_bdevs_list": [ 00:20:45.436 { 00:20:45.437 "name": null, 00:20:45.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.437 "is_configured": false, 00:20:45.437 "data_offset": 2048, 00:20:45.437 "data_size": 63488 00:20:45.437 }, 00:20:45.437 { 00:20:45.437 "name": null, 00:20:45.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:45.437 "is_configured": false, 00:20:45.437 "data_offset": 2048, 00:20:45.437 "data_size": 63488 00:20:45.437 }, 00:20:45.437 { 00:20:45.437 "name": "BaseBdev3", 00:20:45.437 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:45.437 "is_configured": true, 00:20:45.437 "data_offset": 2048, 00:20:45.437 "data_size": 63488 00:20:45.437 }, 00:20:45.437 { 00:20:45.437 "name": "BaseBdev4", 00:20:45.437 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:45.437 "is_configured": true, 00:20:45.437 "data_offset": 2048, 00:20:45.437 "data_size": 63488 00:20:45.437 } 
00:20:45.437 ] 00:20:45.437 }' 00:20:45.437 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:45.437 04:21:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:45.999 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:45.999 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:45.999 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:45.999 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:45.999 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:45.999 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.999 04:21:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:46.255 04:21:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:46.255 "name": "raid_bdev1", 00:20:46.255 "uuid": "a2fd9fa8-9924-4e88-b302-e5db4f6ee650", 00:20:46.255 "strip_size_kb": 0, 00:20:46.255 "state": "online", 00:20:46.255 "raid_level": "raid1", 00:20:46.255 "superblock": true, 00:20:46.255 "num_base_bdevs": 4, 00:20:46.255 "num_base_bdevs_discovered": 2, 00:20:46.255 "num_base_bdevs_operational": 2, 00:20:46.255 "base_bdevs_list": [ 00:20:46.255 { 00:20:46.255 "name": null, 00:20:46.255 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.255 "is_configured": false, 00:20:46.255 "data_offset": 2048, 00:20:46.255 "data_size": 63488 00:20:46.255 }, 00:20:46.255 { 00:20:46.255 "name": null, 00:20:46.255 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.255 "is_configured": false, 00:20:46.255 "data_offset": 2048, 00:20:46.255 "data_size": 63488 00:20:46.255 }, 00:20:46.255 { 00:20:46.255 "name": "BaseBdev3", 00:20:46.255 "uuid": "79d2ab11-d7cc-5893-816c-1b767131a59f", 00:20:46.255 "is_configured": true, 00:20:46.255 "data_offset": 2048, 00:20:46.255 "data_size": 63488 00:20:46.255 }, 00:20:46.255 { 00:20:46.255 "name": "BaseBdev4", 00:20:46.255 "uuid": "1187fa59-8750-507d-a5d3-e20f77a86d4e", 00:20:46.255 "is_configured": true, 00:20:46.255 "data_offset": 2048, 00:20:46.255 "data_size": 63488 00:20:46.255 } 00:20:46.255 ] 00:20:46.255 }' 00:20:46.255 04:21:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:46.255 04:21:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:46.255 04:21:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@783 -- # killprocess 3921898 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@946 -- # '[' -z 3921898 ']' 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # kill -0 3921898 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # uname 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3921898 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3921898' 00:20:46.513 killing process with pid 3921898 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@965 -- # kill 3921898 00:20:46.513 Received shutdown signal, test time was about 60.000000 seconds 00:20:46.513 00:20:46.513 Latency(us) 00:20:46.513 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:20:46.513 =================================================================================================================== 00:20:46.513 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:20:46.513 [2024-05-15 04:21:34.312915] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:46.513 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@970 -- # wait 3921898 00:20:46.513 [2024-05-15 04:21:34.313024] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:46.513 [2024-05-15 04:21:34.313109] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:46.513 [2024-05-15 04:21:34.313139] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x221b330 name raid_bdev1, state offline 00:20:46.513 [2024-05-15 04:21:34.373170] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@785 -- # return 0 00:20:46.771 00:20:46.771 real 0m38.446s 00:20:46.771 user 0m56.556s 00:20:46.771 sys 0m6.155s 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1122 -- # xtrace_disable 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:46.771 ************************************ 00:20:46.771 END TEST raid_rebuild_test_sb 00:20:46.771 ************************************ 00:20:46.771 04:21:34 bdev_raid -- bdev/bdev_raid.sh@813 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:20:46.771 04:21:34 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:20:46.771 04:21:34 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:20:46.771 04:21:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:46.771 ************************************ 00:20:46.771 START TEST raid_rebuild_test_io 00:20:46.771 ************************************ 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 false true true 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=4 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local superblock=false 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local background_io=true 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local verify=true 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:20:46.771 04:21:34 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev3 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:20:46.771 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev4 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local strip_size 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local create_arg 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@579 -- # local data_offset 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@592 -- # '[' false = true ']' 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # raid_pid=3927483 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@598 -- # waitforlisten 3927483 /var/tmp/spdk-raid.sock 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@827 -- # '[' -z 3927483 ']' 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:46.772 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
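The raid_rebuild_test_io flow that follows boils down to: start bdevperf in wait mode (-z) on a private RPC socket, build a raid1 bdev out of malloc-backed passthru bdevs over that socket, then trigger the queued randrw workload with bdevperf.py perform_tests while the test removes and re-adds base bdevs. A condensed standalone sketch of that pattern — assuming the same workspace paths and using only the RPC calls that appear in this trace (the polling loop stands in for the harness's waitforlisten helper) — might look like:

    #!/usr/bin/env bash
    # Illustrative sketch only; paths, sizes and flags are taken from the trace above.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    SOCK=/var/tmp/spdk-raid.sock

    # Start bdevperf on a private RPC socket; -z queues the workload until
    # perform_tests is called, -T pins it to the raid bdev under test.
    "$SPDK"/build/examples/bdevperf -r "$SOCK" -T raid_bdev1 -t 60 -w randrw \
        -M 50 -o 3M -q 2 -U -z -L bdev_raid &

    # Poll the RPC socket until bdevperf answers (stand-in for waitforlisten).
    until "$SPDK"/scripts/rpc.py -s "$SOCK" rpc_get_methods >/dev/null 2>&1; do
        sleep 0.2
    done

    # Build the raid1 bdev from four malloc-backed passthru base bdevs.
    for i in 1 2 3 4; do
        "$SPDK"/scripts/rpc.py -s "$SOCK" bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        "$SPDK"/scripts/rpc.py -s "$SOCK" bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
    done
    "$SPDK"/scripts/rpc.py -s "$SOCK" bdev_raid_create -r raid1 \
        -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1

    # Kick off the queued I/O, then inspect rebuild state the same way the
    # harness does, via bdev_raid_get_bdevs plus jq.
    "$SPDK"/examples/bdev/bdevperf/bdevperf.py -s "$SOCK" perform_tests &
    "$SPDK"/scripts/rpc.py -s "$SOCK" bdev_raid_get_bdevs all |
        jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"'

Driving the background I/O through the same RPC socket is what lets the test assert on .process.type == "rebuild" and .process.target while real traffic is in flight against raid_bdev1.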
00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:20:46.772 04:21:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:46.772 [2024-05-15 04:21:34.744181] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:20:46.772 [2024-05-15 04:21:34.744260] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3927483 ] 00:20:46.772 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:46.772 Zero copy mechanism will not be used. 00:20:47.029 [2024-05-15 04:21:34.818974] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:47.029 [2024-05-15 04:21:34.928282] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:20:47.029 [2024-05-15 04:21:34.993399] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:47.029 [2024-05-15 04:21:34.993435] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:47.961 04:21:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:20:47.961 04:21:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # return 0 00:20:47.961 04:21:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:20:47.961 04:21:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:47.961 BaseBdev1_malloc 00:20:47.961 04:21:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:20:48.218 [2024-05-15 04:21:36.158231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:20:48.218 [2024-05-15 04:21:36.158299] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:48.218 [2024-05-15 04:21:36.158329] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2e000 00:20:48.218 [2024-05-15 04:21:36.158347] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:48.218 [2024-05-15 04:21:36.160055] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:48.218 [2024-05-15 04:21:36.160084] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:48.218 BaseBdev1 00:20:48.218 04:21:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:20:48.218 04:21:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:48.475 BaseBdev2_malloc 00:20:48.475 04:21:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:20:48.732 [2024-05-15 04:21:36.642293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:20:48.732 [2024-05-15 04:21:36.642348] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:20:48.732 [2024-05-15 04:21:36.642374] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd92c0 00:20:48.733 [2024-05-15 04:21:36.642389] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:48.733 [2024-05-15 04:21:36.643848] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:48.733 [2024-05-15 04:21:36.643896] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:48.733 BaseBdev2 00:20:48.733 04:21:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:20:48.733 04:21:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:48.990 BaseBdev3_malloc 00:20:48.990 04:21:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:20:49.248 [2024-05-15 04:21:37.134738] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:20:49.248 [2024-05-15 04:21:37.134792] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:49.248 [2024-05-15 04:21:37.134818] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcdae30 00:20:49.248 [2024-05-15 04:21:37.134847] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:49.248 [2024-05-15 04:21:37.136246] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:49.248 [2024-05-15 04:21:37.136274] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:49.248 BaseBdev3 00:20:49.248 04:21:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:20:49.248 04:21:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:49.505 BaseBdev4_malloc 00:20:49.505 04:21:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:20:49.762 [2024-05-15 04:21:37.635136] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:20:49.762 [2024-05-15 04:21:37.635206] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:49.762 [2024-05-15 04:21:37.635233] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcde1a0 00:20:49.762 [2024-05-15 04:21:37.635249] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:49.762 [2024-05-15 04:21:37.636616] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:49.762 [2024-05-15 04:21:37.636645] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:49.762 BaseBdev4 00:20:49.762 04:21:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:20:50.020 spare_malloc 00:20:50.020 04:21:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:20:50.277 spare_delay 00:20:50.277 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@609 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:20:50.534 [2024-05-15 04:21:38.376392] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:20:50.534 [2024-05-15 04:21:38.376457] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:50.534 [2024-05-15 04:21:38.376484] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcde4b0 00:20:50.535 [2024-05-15 04:21:38.376500] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:50.535 [2024-05-15 04:21:38.378072] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:50.535 [2024-05-15 04:21:38.378101] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:20:50.535 spare 00:20:50.535 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:20:50.792 [2024-05-15 04:21:38.613046] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:50.792 [2024-05-15 04:21:38.614269] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:50.792 [2024-05-15 04:21:38.614334] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:50.792 [2024-05-15 04:21:38.614395] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:50.792 [2024-05-15 04:21:38.614491] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xcd5e90 00:20:50.792 [2024-05-15 04:21:38.614508] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:50.792 [2024-05-15 04:21:38.614723] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb28410 00:20:50.792 [2024-05-15 04:21:38.614918] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcd5e90 00:20:50.792 [2024-05-15 04:21:38.614935] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcd5e90 00:20:50.792 [2024-05-15 04:21:38.615067] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.792 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:51.050 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:51.050 "name": "raid_bdev1", 00:20:51.050 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:20:51.050 "strip_size_kb": 0, 00:20:51.050 "state": "online", 00:20:51.050 "raid_level": "raid1", 00:20:51.050 "superblock": false, 00:20:51.050 "num_base_bdevs": 4, 00:20:51.050 "num_base_bdevs_discovered": 4, 00:20:51.050 "num_base_bdevs_operational": 4, 00:20:51.050 "base_bdevs_list": [ 00:20:51.050 { 00:20:51.050 "name": "BaseBdev1", 00:20:51.050 "uuid": "3526bebd-47d3-59c2-9003-6cb03d1cf798", 00:20:51.050 "is_configured": true, 00:20:51.050 "data_offset": 0, 00:20:51.050 "data_size": 65536 00:20:51.050 }, 00:20:51.050 { 00:20:51.050 "name": "BaseBdev2", 00:20:51.050 "uuid": "4d533fa3-2226-5dcb-a669-005cf3bd41aa", 00:20:51.050 "is_configured": true, 00:20:51.050 "data_offset": 0, 00:20:51.050 "data_size": 65536 00:20:51.050 }, 00:20:51.050 { 00:20:51.050 "name": "BaseBdev3", 00:20:51.050 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:20:51.050 "is_configured": true, 00:20:51.050 "data_offset": 0, 00:20:51.050 "data_size": 65536 00:20:51.050 }, 00:20:51.050 { 00:20:51.050 "name": "BaseBdev4", 00:20:51.050 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:20:51.050 "is_configured": true, 00:20:51.050 "data_offset": 0, 00:20:51.050 "data_size": 65536 00:20:51.050 } 00:20:51.050 ] 00:20:51.050 }' 00:20:51.050 04:21:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:51.050 04:21:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:51.616 04:21:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:51.616 04:21:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:20:51.874 [2024-05-15 04:21:39.652095] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:51.874 04:21:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # raid_bdev_size=65536 00:20:51.874 04:21:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.874 04:21:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@619 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:20:52.131 04:21:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@619 -- # data_offset=0 00:20:52.131 04:21:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@621 -- # '[' true = true ']' 00:20:52.131 04:21:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:20:52.131 04:21:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@623 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:52.131 [2024-05-15 04:21:40.011549] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb26930 00:20:52.131 I/O size of 3145728 is greater than zero copy threshold (65536). 00:20:52.131 Zero copy mechanism will not be used. 00:20:52.131 Running I/O for 60 seconds... 00:20:52.389 [2024-05-15 04:21:40.152545] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:52.389 [2024-05-15 04:21:40.165160] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xb26930 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.389 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:52.647 04:21:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:52.647 "name": "raid_bdev1", 00:20:52.647 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:20:52.647 "strip_size_kb": 0, 00:20:52.647 "state": "online", 00:20:52.647 "raid_level": "raid1", 00:20:52.647 "superblock": false, 00:20:52.647 "num_base_bdevs": 4, 00:20:52.647 "num_base_bdevs_discovered": 3, 00:20:52.647 "num_base_bdevs_operational": 3, 00:20:52.647 "base_bdevs_list": [ 00:20:52.647 { 00:20:52.647 "name": null, 00:20:52.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:52.647 "is_configured": false, 00:20:52.647 "data_offset": 0, 00:20:52.647 "data_size": 65536 00:20:52.647 }, 00:20:52.647 { 00:20:52.647 "name": "BaseBdev2", 00:20:52.647 "uuid": "4d533fa3-2226-5dcb-a669-005cf3bd41aa", 00:20:52.647 "is_configured": true, 00:20:52.647 "data_offset": 0, 00:20:52.647 "data_size": 65536 00:20:52.647 }, 00:20:52.647 { 00:20:52.647 "name": "BaseBdev3", 00:20:52.647 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:20:52.647 "is_configured": true, 00:20:52.647 "data_offset": 0, 00:20:52.647 "data_size": 65536 00:20:52.647 }, 00:20:52.647 { 00:20:52.647 "name": "BaseBdev4", 00:20:52.647 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:20:52.647 "is_configured": true, 00:20:52.647 "data_offset": 0, 00:20:52.647 "data_size": 65536 00:20:52.647 } 00:20:52.647 ] 00:20:52.647 }' 00:20:52.647 04:21:40 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:52.647 04:21:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:53.212 04:21:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:53.469 [2024-05-15 04:21:41.253294] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:53.469 04:21:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@647 -- # sleep 1 00:20:53.469 [2024-05-15 04:21:41.299600] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb29070 00:20:53.469 [2024-05-15 04:21:41.302004] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:53.469 [2024-05-15 04:21:41.413456] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:53.469 [2024-05-15 04:21:41.415027] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:53.727 [2024-05-15 04:21:41.648132] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:53.727 [2024-05-15 04:21:41.648485] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:53.985 [2024-05-15 04:21:41.897433] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:53.985 [2024-05-15 04:21:41.897865] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:54.242 [2024-05-15 04:21:42.108484] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:54.242 [2024-05-15 04:21:42.108801] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:20:54.499 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:54.499 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:54.499 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:54.499 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:54.499 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:54.499 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.499 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:54.499 [2024-05-15 04:21:42.464966] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:54.757 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:54.757 "name": "raid_bdev1", 00:20:54.757 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:20:54.757 "strip_size_kb": 0, 00:20:54.757 "state": "online", 00:20:54.757 "raid_level": "raid1", 00:20:54.757 "superblock": false, 00:20:54.757 "num_base_bdevs": 4, 00:20:54.757 
"num_base_bdevs_discovered": 4, 00:20:54.757 "num_base_bdevs_operational": 4, 00:20:54.757 "process": { 00:20:54.757 "type": "rebuild", 00:20:54.757 "target": "spare", 00:20:54.757 "progress": { 00:20:54.757 "blocks": 14336, 00:20:54.757 "percent": 21 00:20:54.757 } 00:20:54.757 }, 00:20:54.757 "base_bdevs_list": [ 00:20:54.757 { 00:20:54.757 "name": "spare", 00:20:54.757 "uuid": "704a795e-3289-5c44-ad1f-497addb7b005", 00:20:54.757 "is_configured": true, 00:20:54.757 "data_offset": 0, 00:20:54.757 "data_size": 65536 00:20:54.757 }, 00:20:54.757 { 00:20:54.757 "name": "BaseBdev2", 00:20:54.757 "uuid": "4d533fa3-2226-5dcb-a669-005cf3bd41aa", 00:20:54.757 "is_configured": true, 00:20:54.757 "data_offset": 0, 00:20:54.757 "data_size": 65536 00:20:54.757 }, 00:20:54.757 { 00:20:54.757 "name": "BaseBdev3", 00:20:54.757 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:20:54.757 "is_configured": true, 00:20:54.757 "data_offset": 0, 00:20:54.757 "data_size": 65536 00:20:54.757 }, 00:20:54.757 { 00:20:54.757 "name": "BaseBdev4", 00:20:54.757 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:20:54.757 "is_configured": true, 00:20:54.757 "data_offset": 0, 00:20:54.757 "data_size": 65536 00:20:54.757 } 00:20:54.757 ] 00:20:54.757 }' 00:20:54.757 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:54.757 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:54.757 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:54.757 [2024-05-15 04:21:42.602000] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:54.757 [2024-05-15 04:21:42.602297] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:54.758 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:54.758 04:21:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:20:55.016 [2024-05-15 04:21:42.825470] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:20:55.016 [2024-05-15 04:21:42.955015] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:20:55.016 [2024-05-15 04:21:42.967236] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:55.016 [2024-05-15 04:21:42.982947] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xb26930 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:20:55.016 04:21:43 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.016 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:55.273 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:20:55.273 "name": "raid_bdev1", 00:20:55.273 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:20:55.273 "strip_size_kb": 0, 00:20:55.273 "state": "online", 00:20:55.273 "raid_level": "raid1", 00:20:55.273 "superblock": false, 00:20:55.273 "num_base_bdevs": 4, 00:20:55.273 "num_base_bdevs_discovered": 3, 00:20:55.273 "num_base_bdevs_operational": 3, 00:20:55.273 "base_bdevs_list": [ 00:20:55.274 { 00:20:55.274 "name": null, 00:20:55.274 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.274 "is_configured": false, 00:20:55.274 "data_offset": 0, 00:20:55.274 "data_size": 65536 00:20:55.274 }, 00:20:55.274 { 00:20:55.274 "name": "BaseBdev2", 00:20:55.274 "uuid": "4d533fa3-2226-5dcb-a669-005cf3bd41aa", 00:20:55.274 "is_configured": true, 00:20:55.274 "data_offset": 0, 00:20:55.274 "data_size": 65536 00:20:55.274 }, 00:20:55.274 { 00:20:55.274 "name": "BaseBdev3", 00:20:55.274 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:20:55.274 "is_configured": true, 00:20:55.274 "data_offset": 0, 00:20:55.274 "data_size": 65536 00:20:55.274 }, 00:20:55.274 { 00:20:55.274 "name": "BaseBdev4", 00:20:55.274 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:20:55.274 "is_configured": true, 00:20:55.274 "data_offset": 0, 00:20:55.274 "data_size": 65536 00:20:55.274 } 00:20:55.274 ] 00:20:55.274 }' 00:20:55.274 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:20:55.274 04:21:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:20:55.838 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 00:20:55.838 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:55.838 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:20:55.838 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:20:55.838 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:55.838 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.838 04:21:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:56.403 04:21:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:56.403 "name": "raid_bdev1", 00:20:56.403 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:20:56.403 "strip_size_kb": 0, 00:20:56.403 "state": "online", 00:20:56.403 "raid_level": "raid1", 00:20:56.403 "superblock": false, 00:20:56.403 "num_base_bdevs": 4, 00:20:56.403 "num_base_bdevs_discovered": 3, 00:20:56.403 
"num_base_bdevs_operational": 3, 00:20:56.403 "base_bdevs_list": [ 00:20:56.403 { 00:20:56.403 "name": null, 00:20:56.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.403 "is_configured": false, 00:20:56.403 "data_offset": 0, 00:20:56.403 "data_size": 65536 00:20:56.403 }, 00:20:56.403 { 00:20:56.403 "name": "BaseBdev2", 00:20:56.403 "uuid": "4d533fa3-2226-5dcb-a669-005cf3bd41aa", 00:20:56.404 "is_configured": true, 00:20:56.404 "data_offset": 0, 00:20:56.404 "data_size": 65536 00:20:56.404 }, 00:20:56.404 { 00:20:56.404 "name": "BaseBdev3", 00:20:56.404 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:20:56.404 "is_configured": true, 00:20:56.404 "data_offset": 0, 00:20:56.404 "data_size": 65536 00:20:56.404 }, 00:20:56.404 { 00:20:56.404 "name": "BaseBdev4", 00:20:56.404 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:20:56.404 "is_configured": true, 00:20:56.404 "data_offset": 0, 00:20:56.404 "data_size": 65536 00:20:56.404 } 00:20:56.404 ] 00:20:56.404 }' 00:20:56.404 04:21:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:56.404 04:21:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:20:56.404 04:21:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:56.404 04:21:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:20:56.404 04:21:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:20:56.665 [2024-05-15 04:21:44.512222] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:20:56.665 04:21:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # sleep 1 00:20:56.665 [2024-05-15 04:21:44.584971] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb2c730 00:20:56.665 [2024-05-15 04:21:44.586651] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:20:56.932 [2024-05-15 04:21:44.698242] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:56.932 [2024-05-15 04:21:44.698803] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:20:56.932 [2024-05-15 04:21:44.911464] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:56.932 [2024-05-15 04:21:44.912293] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:20:57.498 [2024-05-15 04:21:45.263372] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:57.498 [2024-05-15 04:21:45.264945] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:20:57.755 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:57.755 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:57.755 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:57.755 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:57.755 
04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:57.755 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.755 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.012 [2024-05-15 04:21:45.805915] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:20:58.012 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:58.012 "name": "raid_bdev1", 00:20:58.012 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:20:58.012 "strip_size_kb": 0, 00:20:58.012 "state": "online", 00:20:58.012 "raid_level": "raid1", 00:20:58.012 "superblock": false, 00:20:58.012 "num_base_bdevs": 4, 00:20:58.013 "num_base_bdevs_discovered": 4, 00:20:58.013 "num_base_bdevs_operational": 4, 00:20:58.013 "process": { 00:20:58.013 "type": "rebuild", 00:20:58.013 "target": "spare", 00:20:58.013 "progress": { 00:20:58.013 "blocks": 12288, 00:20:58.013 "percent": 18 00:20:58.013 } 00:20:58.013 }, 00:20:58.013 "base_bdevs_list": [ 00:20:58.013 { 00:20:58.013 "name": "spare", 00:20:58.013 "uuid": "704a795e-3289-5c44-ad1f-497addb7b005", 00:20:58.013 "is_configured": true, 00:20:58.013 "data_offset": 0, 00:20:58.013 "data_size": 65536 00:20:58.013 }, 00:20:58.013 { 00:20:58.013 "name": "BaseBdev2", 00:20:58.013 "uuid": "4d533fa3-2226-5dcb-a669-005cf3bd41aa", 00:20:58.013 "is_configured": true, 00:20:58.013 "data_offset": 0, 00:20:58.013 "data_size": 65536 00:20:58.013 }, 00:20:58.013 { 00:20:58.013 "name": "BaseBdev3", 00:20:58.013 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:20:58.013 "is_configured": true, 00:20:58.013 "data_offset": 0, 00:20:58.013 "data_size": 65536 00:20:58.013 }, 00:20:58.013 { 00:20:58.013 "name": "BaseBdev4", 00:20:58.013 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:20:58.013 "is_configured": true, 00:20:58.013 "data_offset": 0, 00:20:58.013 "data_size": 65536 00:20:58.013 } 00:20:58.013 ] 00:20:58.013 }' 00:20:58.013 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:58.013 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:58.013 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:58.013 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:58.013 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@666 -- # '[' false = true ']' 00:20:58.013 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=4 00:20:58.013 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:20:58.013 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@693 -- # '[' 4 -gt 2 ']' 00:20:58.013 04:21:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@695 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:58.013 [2024-05-15 04:21:45.946437] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:20:58.270 [2024-05-15 04:21:46.108906] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: 
BaseBdev2 00:20:58.270 [2024-05-15 04:21:46.160593] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:58.270 [2024-05-15 04:21:46.162035] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:20:58.270 [2024-05-15 04:21:46.270969] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xb26930 00:20:58.270 [2024-05-15 04:21:46.270995] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xb2c730 00:20:58.528 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # base_bdevs[1]= 00:20:58.528 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@699 -- # (( num_base_bdevs_operational-- )) 00:20:58.528 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@702 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:58.528 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:58.528 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:58.528 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:58.528 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:58.528 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.528 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:58.528 [2024-05-15 04:21:46.393398] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:58.528 [2024-05-15 04:21:46.393598] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:20:58.805 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:58.805 "name": "raid_bdev1", 00:20:58.805 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:20:58.805 "strip_size_kb": 0, 00:20:58.806 "state": "online", 00:20:58.806 "raid_level": "raid1", 00:20:58.806 "superblock": false, 00:20:58.806 "num_base_bdevs": 4, 00:20:58.806 "num_base_bdevs_discovered": 3, 00:20:58.806 "num_base_bdevs_operational": 3, 00:20:58.806 "process": { 00:20:58.806 "type": "rebuild", 00:20:58.806 "target": "spare", 00:20:58.806 "progress": { 00:20:58.806 "blocks": 24576, 00:20:58.806 "percent": 37 00:20:58.806 } 00:20:58.806 }, 00:20:58.806 "base_bdevs_list": [ 00:20:58.806 { 00:20:58.806 "name": "spare", 00:20:58.806 "uuid": "704a795e-3289-5c44-ad1f-497addb7b005", 00:20:58.806 "is_configured": true, 00:20:58.806 "data_offset": 0, 00:20:58.806 "data_size": 65536 00:20:58.806 }, 00:20:58.806 { 00:20:58.806 "name": null, 00:20:58.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.806 "is_configured": false, 00:20:58.806 "data_offset": 0, 00:20:58.806 "data_size": 65536 00:20:58.806 }, 00:20:58.806 { 00:20:58.806 "name": "BaseBdev3", 00:20:58.806 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:20:58.806 "is_configured": true, 00:20:58.806 "data_offset": 0, 00:20:58.806 "data_size": 65536 00:20:58.806 }, 00:20:58.806 { 00:20:58.806 "name": "BaseBdev4", 00:20:58.806 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:20:58.806 "is_configured": true, 00:20:58.806 
"data_offset": 0, 00:20:58.806 "data_size": 65536 00:20:58.806 } 00:20:58.806 ] 00:20:58.806 }' 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # local timeout=782 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.806 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:59.104 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:20:59.104 "name": "raid_bdev1", 00:20:59.104 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:20:59.104 "strip_size_kb": 0, 00:20:59.104 "state": "online", 00:20:59.104 "raid_level": "raid1", 00:20:59.104 "superblock": false, 00:20:59.104 "num_base_bdevs": 4, 00:20:59.104 "num_base_bdevs_discovered": 3, 00:20:59.104 "num_base_bdevs_operational": 3, 00:20:59.104 "process": { 00:20:59.104 "type": "rebuild", 00:20:59.104 "target": "spare", 00:20:59.104 "progress": { 00:20:59.104 "blocks": 30720, 00:20:59.104 "percent": 46 00:20:59.104 } 00:20:59.104 }, 00:20:59.104 "base_bdevs_list": [ 00:20:59.104 { 00:20:59.104 "name": "spare", 00:20:59.104 "uuid": "704a795e-3289-5c44-ad1f-497addb7b005", 00:20:59.104 "is_configured": true, 00:20:59.104 "data_offset": 0, 00:20:59.104 "data_size": 65536 00:20:59.104 }, 00:20:59.104 { 00:20:59.104 "name": null, 00:20:59.104 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.104 "is_configured": false, 00:20:59.104 "data_offset": 0, 00:20:59.104 "data_size": 65536 00:20:59.104 }, 00:20:59.104 { 00:20:59.104 "name": "BaseBdev3", 00:20:59.105 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:20:59.105 "is_configured": true, 00:20:59.105 "data_offset": 0, 00:20:59.105 "data_size": 65536 00:20:59.105 }, 00:20:59.105 { 00:20:59.105 "name": "BaseBdev4", 00:20:59.105 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:20:59.105 "is_configured": true, 00:20:59.105 "data_offset": 0, 00:20:59.105 "data_size": 65536 00:20:59.105 } 00:20:59.105 ] 00:20:59.105 }' 00:20:59.105 04:21:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:20:59.105 04:21:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:20:59.105 04:21:47 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:20:59.105 04:21:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:20:59.105 04:21:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@711 -- # sleep 1 00:20:59.362 [2024-05-15 04:21:47.127832] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:20:59.362 [2024-05-15 04:21:47.128184] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:20:59.362 [2024-05-15 04:21:47.342199] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:20:59.362 [2024-05-15 04:21:47.342501] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:20:59.619 [2024-05-15 04:21:47.453769] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:20:59.877 [2024-05-15 04:21:47.707959] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:21:00.135 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:21:00.135 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:00.135 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:00.135 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:00.135 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:00.135 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:00.135 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.135 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:00.393 [2024-05-15 04:21:48.163917] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:21:00.393 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:00.393 "name": "raid_bdev1", 00:21:00.393 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:21:00.393 "strip_size_kb": 0, 00:21:00.393 "state": "online", 00:21:00.393 "raid_level": "raid1", 00:21:00.393 "superblock": false, 00:21:00.393 "num_base_bdevs": 4, 00:21:00.393 "num_base_bdevs_discovered": 3, 00:21:00.393 "num_base_bdevs_operational": 3, 00:21:00.393 "process": { 00:21:00.393 "type": "rebuild", 00:21:00.393 "target": "spare", 00:21:00.393 "progress": { 00:21:00.393 "blocks": 51200, 00:21:00.393 "percent": 78 00:21:00.393 } 00:21:00.393 }, 00:21:00.393 "base_bdevs_list": [ 00:21:00.393 { 00:21:00.393 "name": "spare", 00:21:00.393 "uuid": "704a795e-3289-5c44-ad1f-497addb7b005", 00:21:00.393 "is_configured": true, 00:21:00.393 "data_offset": 0, 00:21:00.393 "data_size": 65536 00:21:00.393 }, 00:21:00.393 { 00:21:00.393 "name": null, 00:21:00.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.393 "is_configured": false, 00:21:00.393 "data_offset": 0, 00:21:00.393 "data_size": 65536 00:21:00.393 }, 00:21:00.393 { 
00:21:00.393 "name": "BaseBdev3", 00:21:00.393 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:21:00.393 "is_configured": true, 00:21:00.393 "data_offset": 0, 00:21:00.393 "data_size": 65536 00:21:00.393 }, 00:21:00.393 { 00:21:00.393 "name": "BaseBdev4", 00:21:00.393 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:21:00.393 "is_configured": true, 00:21:00.393 "data_offset": 0, 00:21:00.393 "data_size": 65536 00:21:00.393 } 00:21:00.393 ] 00:21:00.393 }' 00:21:00.393 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:00.393 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:00.393 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:00.393 [2024-05-15 04:21:48.386075] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:21:00.650 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:00.650 04:21:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@711 -- # sleep 1 00:21:00.907 [2024-05-15 04:21:48.707397] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:21:01.166 [2024-05-15 04:21:49.157799] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:01.424 [2024-05-15 04:21:49.265197] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:01.424 [2024-05-15 04:21:49.267962] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:01.424 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:21:01.424 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:01.424 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:01.424 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:01.424 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:01.424 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:01.424 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.424 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.686 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:01.686 "name": "raid_bdev1", 00:21:01.686 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:21:01.686 "strip_size_kb": 0, 00:21:01.686 "state": "online", 00:21:01.686 "raid_level": "raid1", 00:21:01.686 "superblock": false, 00:21:01.686 "num_base_bdevs": 4, 00:21:01.686 "num_base_bdevs_discovered": 3, 00:21:01.686 "num_base_bdevs_operational": 3, 00:21:01.686 "base_bdevs_list": [ 00:21:01.686 { 00:21:01.686 "name": "spare", 00:21:01.686 "uuid": "704a795e-3289-5c44-ad1f-497addb7b005", 00:21:01.686 "is_configured": true, 00:21:01.686 "data_offset": 0, 00:21:01.686 "data_size": 65536 00:21:01.686 }, 00:21:01.686 { 00:21:01.686 "name": null, 00:21:01.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.686 
"is_configured": false, 00:21:01.686 "data_offset": 0, 00:21:01.687 "data_size": 65536 00:21:01.687 }, 00:21:01.687 { 00:21:01.687 "name": "BaseBdev3", 00:21:01.687 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:21:01.687 "is_configured": true, 00:21:01.687 "data_offset": 0, 00:21:01.687 "data_size": 65536 00:21:01.687 }, 00:21:01.687 { 00:21:01.687 "name": "BaseBdev4", 00:21:01.687 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:21:01.687 "is_configured": true, 00:21:01.687 "data_offset": 0, 00:21:01.687 "data_size": 65536 00:21:01.687 } 00:21:01.687 ] 00:21:01.687 }' 00:21:01.687 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@709 -- # break 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.947 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.205 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:02.205 "name": "raid_bdev1", 00:21:02.205 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:21:02.205 "strip_size_kb": 0, 00:21:02.205 "state": "online", 00:21:02.205 "raid_level": "raid1", 00:21:02.205 "superblock": false, 00:21:02.205 "num_base_bdevs": 4, 00:21:02.205 "num_base_bdevs_discovered": 3, 00:21:02.205 "num_base_bdevs_operational": 3, 00:21:02.205 "base_bdevs_list": [ 00:21:02.205 { 00:21:02.205 "name": "spare", 00:21:02.205 "uuid": "704a795e-3289-5c44-ad1f-497addb7b005", 00:21:02.205 "is_configured": true, 00:21:02.205 "data_offset": 0, 00:21:02.205 "data_size": 65536 00:21:02.205 }, 00:21:02.205 { 00:21:02.205 "name": null, 00:21:02.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.205 "is_configured": false, 00:21:02.205 "data_offset": 0, 00:21:02.205 "data_size": 65536 00:21:02.205 }, 00:21:02.205 { 00:21:02.205 "name": "BaseBdev3", 00:21:02.205 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:21:02.205 "is_configured": true, 00:21:02.205 "data_offset": 0, 00:21:02.205 "data_size": 65536 00:21:02.205 }, 00:21:02.205 { 00:21:02.205 "name": "BaseBdev4", 00:21:02.205 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:21:02.205 "is_configured": true, 00:21:02.205 "data_offset": 0, 00:21:02.205 "data_size": 65536 00:21:02.205 } 00:21:02.205 ] 00:21:02.205 }' 00:21:02.205 04:21:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:02.205 04:21:50 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.205 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:02.463 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:02.463 "name": "raid_bdev1", 00:21:02.463 "uuid": "341f0a6f-63d6-4988-8fbc-75c735340a9a", 00:21:02.463 "strip_size_kb": 0, 00:21:02.463 "state": "online", 00:21:02.463 "raid_level": "raid1", 00:21:02.463 "superblock": false, 00:21:02.463 "num_base_bdevs": 4, 00:21:02.463 "num_base_bdevs_discovered": 3, 00:21:02.463 "num_base_bdevs_operational": 3, 00:21:02.463 "base_bdevs_list": [ 00:21:02.463 { 00:21:02.464 "name": "spare", 00:21:02.464 "uuid": "704a795e-3289-5c44-ad1f-497addb7b005", 00:21:02.464 "is_configured": true, 00:21:02.464 "data_offset": 0, 00:21:02.464 "data_size": 65536 00:21:02.464 }, 00:21:02.464 { 00:21:02.464 "name": null, 00:21:02.464 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.464 "is_configured": false, 00:21:02.464 "data_offset": 0, 00:21:02.464 "data_size": 65536 00:21:02.464 }, 00:21:02.464 { 00:21:02.464 "name": "BaseBdev3", 00:21:02.464 "uuid": "e4e7ddb3-301a-5457-b268-23f8ea33f4c7", 00:21:02.464 "is_configured": true, 00:21:02.464 "data_offset": 0, 00:21:02.464 "data_size": 65536 00:21:02.464 }, 00:21:02.464 { 00:21:02.464 "name": "BaseBdev4", 00:21:02.464 "uuid": "22021bab-e4d2-572c-95b6-86df6f9c1239", 00:21:02.464 "is_configured": true, 00:21:02.464 "data_offset": 0, 00:21:02.464 "data_size": 65536 00:21:02.464 } 00:21:02.464 ] 00:21:02.464 }' 00:21:02.464 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:02.464 04:21:50 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:03.029 04:21:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:03.287 
[2024-05-15 04:21:51.201912] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:03.287 [2024-05-15 04:21:51.201949] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:03.287 00:21:03.287 Latency(us) 00:21:03.287 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:03.287 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:03.287 raid_bdev1 : 11.21 94.66 283.97 0.00 0.00 13933.66 271.55 119615.34 00:21:03.287 =================================================================================================================== 00:21:03.287 Total : 94.66 283.97 0.00 0.00 13933.66 271.55 119615.34 00:21:03.287 [2024-05-15 04:21:51.253834] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:03.287 [2024-05-15 04:21:51.253870] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:03.287 [2024-05-15 04:21:51.253966] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:03.287 [2024-05-15 04:21:51.253984] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd5e90 name raid_bdev1, state offline 00:21:03.287 0 00:21:03.287 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.287 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@720 -- # jq length 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # '[' true = true ']' 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:03.545 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:03.803 /dev/nbd0 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 
-- # (( i = 1 )) 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:03.803 1+0 records in 00:21:03.803 1+0 records out 00:21:03.803 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000169396 s, 24.2 MB/s 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:21:03.803 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # for bdev in "${base_bdevs[@]:1}" 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # '[' -z '' ']' 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@728 -- # continue 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # for bdev in "${base_bdevs[@]:1}" 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # '[' -z BaseBdev3 ']' 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:04.061 04:21:51 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:21:04.061 /dev/nbd1 00:21:04.061 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:04.320 
04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:04.320 1+0 records in 00:21:04.320 1+0 records out 00:21:04.320 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237953 s, 17.2 MB/s 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:04.320 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- 
# (( i <= 20 )) 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # for bdev in "${base_bdevs[@]:1}" 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # '[' -z BaseBdev4 ']' 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:04.578 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:21:04.836 /dev/nbd1 00:21:04.836 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:04.836 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@865 -- # local i 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # break 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:04.837 1+0 records in 00:21:04.837 1+0 records out 00:21:04.837 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189509 s, 21.6 MB/s 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # size=4096 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:04.837 
04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # return 0 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@732 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:04.837 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@734 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:05.095 04:21:52 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:05.353 04:21:53 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@743 -- # '[' false = true ']' 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@783 -- # killprocess 3927483 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@946 -- # '[' -z 3927483 ']' 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # kill -0 3927483 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # uname 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3927483 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3927483' 00:21:05.353 killing process with pid 3927483 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@965 -- # kill 3927483 00:21:05.353 Received shutdown signal, test time was about 13.200396 seconds 00:21:05.353 00:21:05.353 Latency(us) 00:21:05.353 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:05.353 =================================================================================================================== 00:21:05.353 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:21:05.353 [2024-05-15 04:21:53.246355] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:05.353 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@970 -- # wait 3927483 00:21:05.353 [2024-05-15 04:21:53.299134] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:05.611 04:21:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@785 -- # return 0 00:21:05.611 00:21:05.611 real 0m18.882s 00:21:05.611 user 0m29.868s 00:21:05.611 sys 0m2.768s 00:21:05.611 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:21:05.611 04:21:53 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:21:05.611 ************************************ 00:21:05.611 END TEST raid_rebuild_test_io 00:21:05.611 ************************************ 00:21:05.611 04:21:53 bdev_raid -- bdev/bdev_raid.sh@814 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:21:05.612 04:21:53 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:21:05.612 04:21:53 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:21:05.612 04:21:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:05.871 ************************************ 00:21:05.871 START TEST raid_rebuild_test_sb_io 00:21:05.871 ************************************ 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 4 true true true 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 
00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=4 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local superblock=true 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local background_io=true 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local verify=true 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev3 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # echo BaseBdev4 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local strip_size 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local create_arg 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@579 -- # local data_offset 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # '[' true = true ']' 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@593 -- # create_arg+=' -s' 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # raid_pid=3929980 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@598 -- # waitforlisten 3929980 
/var/tmp/spdk-raid.sock 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@827 -- # '[' -z 3929980 ']' 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@832 -- # local max_retries=100 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:05.871 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # xtrace_disable 00:21:05.871 04:21:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:05.871 [2024-05-15 04:21:53.684641] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:21:05.871 [2024-05-15 04:21:53.684728] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3929980 ] 00:21:05.871 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:05.871 Zero copy mechanism will not be used. 00:21:05.871 [2024-05-15 04:21:53.767027] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:06.129 [2024-05-15 04:21:53.889219] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:21:06.129 [2024-05-15 04:21:53.959207] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:06.129 [2024-05-15 04:21:53.959258] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:06.695 04:21:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:21:06.695 04:21:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # return 0 00:21:06.695 04:21:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:21:06.695 04:21:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:06.953 BaseBdev1_malloc 00:21:06.953 04:21:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:07.211 [2024-05-15 04:21:55.123212] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:07.211 [2024-05-15 04:21:55.123282] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.211 [2024-05-15 04:21:55.123322] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a6c000 00:21:07.211 [2024-05-15 04:21:55.123338] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.211 [2024-05-15 04:21:55.125052] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.211 [2024-05-15 04:21:55.125081] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:07.211 BaseBdev1 00:21:07.211 04:21:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:21:07.211 
04:21:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:07.469 BaseBdev2_malloc 00:21:07.469 04:21:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:07.727 [2024-05-15 04:21:55.612167] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:07.727 [2024-05-15 04:21:55.612235] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.727 [2024-05-15 04:21:55.612260] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c172c0 00:21:07.727 [2024-05-15 04:21:55.612276] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.727 [2024-05-15 04:21:55.613637] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.727 [2024-05-15 04:21:55.613665] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:07.727 BaseBdev2 00:21:07.727 04:21:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:21:07.727 04:21:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:07.985 BaseBdev3_malloc 00:21:07.985 04:21:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:21:08.243 [2024-05-15 04:21:56.092191] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:21:08.243 [2024-05-15 04:21:56.092256] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:08.243 [2024-05-15 04:21:56.092282] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c18e30 00:21:08.243 [2024-05-15 04:21:56.092298] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:08.243 [2024-05-15 04:21:56.093735] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:08.243 [2024-05-15 04:21:56.093764] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:08.243 BaseBdev3 00:21:08.243 04:21:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:21:08.243 04:21:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:08.501 BaseBdev4_malloc 00:21:08.501 04:21:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:21:08.759 [2024-05-15 04:21:56.584361] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:21:08.759 [2024-05-15 04:21:56.584435] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:08.759 [2024-05-15 04:21:56.584465] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c1c1a0 
00:21:08.759 [2024-05-15 04:21:56.584481] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:08.759 [2024-05-15 04:21:56.586123] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:08.759 [2024-05-15 04:21:56.586152] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:08.759 BaseBdev4 00:21:08.759 04:21:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:09.017 spare_malloc 00:21:09.017 04:21:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:09.276 spare_delay 00:21:09.276 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@609 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:09.546 [2024-05-15 04:21:57.308884] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:09.546 [2024-05-15 04:21:57.308947] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:09.546 [2024-05-15 04:21:57.308979] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c1c4b0 00:21:09.546 [2024-05-15 04:21:57.308995] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:09.546 [2024-05-15 04:21:57.310617] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:09.546 [2024-05-15 04:21:57.310644] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:09.546 spare 00:21:09.546 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:21:09.546 [2024-05-15 04:21:57.553584] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:09.546 [2024-05-15 04:21:57.555009] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:09.546 [2024-05-15 04:21:57.555076] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:09.546 [2024-05-15 04:21:57.555147] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:09.546 [2024-05-15 04:21:57.555387] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c13e90 00:21:09.546 [2024-05-15 04:21:57.555404] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:09.546 [2024-05-15 04:21:57.555650] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a63150 00:21:09.546 [2024-05-15 04:21:57.555875] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c13e90 00:21:09.546 [2024-05-15 04:21:57.555892] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c13e90 00:21:09.546 [2024-05-15 04:21:57.556027] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:09.803 04:21:57 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=4 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.803 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:10.061 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:10.061 "name": "raid_bdev1", 00:21:10.061 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:10.061 "strip_size_kb": 0, 00:21:10.061 "state": "online", 00:21:10.061 "raid_level": "raid1", 00:21:10.061 "superblock": true, 00:21:10.061 "num_base_bdevs": 4, 00:21:10.061 "num_base_bdevs_discovered": 4, 00:21:10.061 "num_base_bdevs_operational": 4, 00:21:10.061 "base_bdevs_list": [ 00:21:10.061 { 00:21:10.061 "name": "BaseBdev1", 00:21:10.061 "uuid": "4478cc98-1638-5533-9a67-29c1f2791105", 00:21:10.061 "is_configured": true, 00:21:10.061 "data_offset": 2048, 00:21:10.061 "data_size": 63488 00:21:10.061 }, 00:21:10.061 { 00:21:10.061 "name": "BaseBdev2", 00:21:10.061 "uuid": "118fefa4-d44d-5080-96a4-f2a1cc453ff5", 00:21:10.061 "is_configured": true, 00:21:10.061 "data_offset": 2048, 00:21:10.061 "data_size": 63488 00:21:10.061 }, 00:21:10.061 { 00:21:10.061 "name": "BaseBdev3", 00:21:10.061 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:10.061 "is_configured": true, 00:21:10.061 "data_offset": 2048, 00:21:10.061 "data_size": 63488 00:21:10.061 }, 00:21:10.061 { 00:21:10.061 "name": "BaseBdev4", 00:21:10.061 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:10.061 "is_configured": true, 00:21:10.061 "data_offset": 2048, 00:21:10.061 "data_size": 63488 00:21:10.061 } 00:21:10.061 ] 00:21:10.061 }' 00:21:10.061 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:10.061 04:21:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:10.626 04:21:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:10.626 04:21:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:21:10.626 [2024-05-15 04:21:58.588547] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:10.626 04:21:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # raid_bdev_size=63488 00:21:10.626 04:21:58 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.626 04:21:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@619 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:10.883 04:21:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@619 -- # data_offset=2048 00:21:10.883 04:21:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@621 -- # '[' true = true ']' 00:21:10.883 04:21:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:10.883 04:21:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:11.141 [2024-05-15 04:21:58.971979] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c164d0 00:21:11.141 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:11.141 Zero copy mechanism will not be used. 00:21:11.141 Running I/O for 60 seconds... 00:21:11.141 [2024-05-15 04:21:59.092037] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:11.141 [2024-05-15 04:21:59.092274] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1c164d0 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.141 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:11.707 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:11.707 "name": "raid_bdev1", 00:21:11.707 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:11.707 "strip_size_kb": 0, 00:21:11.707 "state": "online", 00:21:11.707 "raid_level": "raid1", 00:21:11.707 "superblock": true, 00:21:11.707 "num_base_bdevs": 4, 00:21:11.707 "num_base_bdevs_discovered": 3, 00:21:11.707 "num_base_bdevs_operational": 3, 00:21:11.707 "base_bdevs_list": [ 00:21:11.707 { 00:21:11.707 "name": null, 00:21:11.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:11.707 "is_configured": false, 00:21:11.707 "data_offset": 
2048, 00:21:11.707 "data_size": 63488 00:21:11.707 }, 00:21:11.707 { 00:21:11.707 "name": "BaseBdev2", 00:21:11.707 "uuid": "118fefa4-d44d-5080-96a4-f2a1cc453ff5", 00:21:11.707 "is_configured": true, 00:21:11.707 "data_offset": 2048, 00:21:11.707 "data_size": 63488 00:21:11.707 }, 00:21:11.707 { 00:21:11.707 "name": "BaseBdev3", 00:21:11.707 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:11.707 "is_configured": true, 00:21:11.707 "data_offset": 2048, 00:21:11.707 "data_size": 63488 00:21:11.707 }, 00:21:11.707 { 00:21:11.707 "name": "BaseBdev4", 00:21:11.707 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:11.707 "is_configured": true, 00:21:11.707 "data_offset": 2048, 00:21:11.707 "data_size": 63488 00:21:11.707 } 00:21:11.707 ] 00:21:11.707 }' 00:21:11.707 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:11.707 04:21:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:12.280 04:22:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:12.543 [2024-05-15 04:22:00.322138] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:12.543 04:22:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@647 -- # sleep 1 00:21:12.543 [2024-05-15 04:22:00.382308] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a66ba0 00:21:12.543 [2024-05-15 04:22:00.384497] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:12.543 [2024-05-15 04:22:00.502324] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:12.543 [2024-05-15 04:22:00.503780] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:12.801 [2024-05-15 04:22:00.753835] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:12.801 [2024-05-15 04:22:00.754688] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:13.367 [2024-05-15 04:22:01.109950] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:13.367 [2024-05-15 04:22:01.256515] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:13.367 [2024-05-15 04:22:01.257339] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:13.367 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:13.367 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:13.367 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:13.367 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:13.367 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:13.367 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:13.367 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:13.625 [2024-05-15 04:22:01.610481] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:13.883 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:13.883 "name": "raid_bdev1", 00:21:13.883 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:13.883 "strip_size_kb": 0, 00:21:13.883 "state": "online", 00:21:13.883 "raid_level": "raid1", 00:21:13.883 "superblock": true, 00:21:13.883 "num_base_bdevs": 4, 00:21:13.883 "num_base_bdevs_discovered": 4, 00:21:13.883 "num_base_bdevs_operational": 4, 00:21:13.883 "process": { 00:21:13.883 "type": "rebuild", 00:21:13.883 "target": "spare", 00:21:13.883 "progress": { 00:21:13.883 "blocks": 14336, 00:21:13.883 "percent": 22 00:21:13.883 } 00:21:13.883 }, 00:21:13.883 "base_bdevs_list": [ 00:21:13.883 { 00:21:13.883 "name": "spare", 00:21:13.883 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:13.883 "is_configured": true, 00:21:13.883 "data_offset": 2048, 00:21:13.883 "data_size": 63488 00:21:13.883 }, 00:21:13.883 { 00:21:13.883 "name": "BaseBdev2", 00:21:13.883 "uuid": "118fefa4-d44d-5080-96a4-f2a1cc453ff5", 00:21:13.883 "is_configured": true, 00:21:13.883 "data_offset": 2048, 00:21:13.883 "data_size": 63488 00:21:13.883 }, 00:21:13.883 { 00:21:13.883 "name": "BaseBdev3", 00:21:13.883 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:13.883 "is_configured": true, 00:21:13.883 "data_offset": 2048, 00:21:13.883 "data_size": 63488 00:21:13.883 }, 00:21:13.883 { 00:21:13.883 "name": "BaseBdev4", 00:21:13.883 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:13.883 "is_configured": true, 00:21:13.883 "data_offset": 2048, 00:21:13.883 "data_size": 63488 00:21:13.883 } 00:21:13.883 ] 00:21:13.883 }' 00:21:13.883 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:13.883 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:13.883 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:13.883 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:13.883 04:22:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:13.883 [2024-05-15 04:22:01.834358] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:14.140 [2024-05-15 04:22:01.976538] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:14.140 [2024-05-15 04:22:02.083573] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:14.140 [2024-05-15 04:22:02.093978] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:14.140 [2024-05-15 04:22:02.121166] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1c164d0 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 
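The trace up to this point assembles the test array entirely over the /var/tmp/spdk-raid.sock RPC socket: four 32 MB malloc bdevs (512-byte blocks) are wrapped in passthru bdevs to act as base devices, a fifth malloc bdev is wrapped in a delay bdev and a passthru bdev to serve as the rebuild target ("spare"), and a RAID1 bdev with an on-disk superblock (-s) is created across the four bases while bdevperf drives background I/O; the array is then degraded and the spare attached to start a rebuild. A minimal sketch of the same sequence, assuming an SPDK application (here bdevperf) is already listening on that socket and using a relative scripts path and the $rpc shorthand purely for brevity (the log invokes rpc.py by its absolute workspace path), would be:

    rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    for i in 1 2 3 4; do
        # 32 MB malloc bdev with 512-byte blocks, exposed through a passthru bdev
        $rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc
        $rpc bdev_passthru_create -b BaseBdev${i}_malloc -p BaseBdev${i}
    done

    # spare: malloc -> delay -> passthru, so writes to the spare are artificially delayed
    $rpc bdev_malloc_create 32 512 -b spare_malloc
    $rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $rpc bdev_passthru_create -b spare_delay -p spare

    # RAID1 across the four bases; -s writes a superblock to each base bdev
    $rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1

    # degrade the array under I/O, then attach the spare to kick off a rebuild
    $rpc bdev_raid_remove_base_bdev BaseBdev1
    $rpc bdev_raid_add_base_bdev raid_bdev1 spare

Every RPC name and argument above is taken from the commands traced in this log; only the shell variable and loop are added for the sketch.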
00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.140 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:14.705 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:14.705 "name": "raid_bdev1", 00:21:14.705 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:14.705 "strip_size_kb": 0, 00:21:14.705 "state": "online", 00:21:14.705 "raid_level": "raid1", 00:21:14.705 "superblock": true, 00:21:14.705 "num_base_bdevs": 4, 00:21:14.705 "num_base_bdevs_discovered": 3, 00:21:14.705 "num_base_bdevs_operational": 3, 00:21:14.705 "base_bdevs_list": [ 00:21:14.705 { 00:21:14.705 "name": null, 00:21:14.705 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:14.705 "is_configured": false, 00:21:14.705 "data_offset": 2048, 00:21:14.705 "data_size": 63488 00:21:14.705 }, 00:21:14.705 { 00:21:14.705 "name": "BaseBdev2", 00:21:14.705 "uuid": "118fefa4-d44d-5080-96a4-f2a1cc453ff5", 00:21:14.705 "is_configured": true, 00:21:14.705 "data_offset": 2048, 00:21:14.705 "data_size": 63488 00:21:14.705 }, 00:21:14.705 { 00:21:14.705 "name": "BaseBdev3", 00:21:14.705 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:14.705 "is_configured": true, 00:21:14.705 "data_offset": 2048, 00:21:14.705 "data_size": 63488 00:21:14.705 }, 00:21:14.705 { 00:21:14.705 "name": "BaseBdev4", 00:21:14.705 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:14.705 "is_configured": true, 00:21:14.705 "data_offset": 2048, 00:21:14.705 "data_size": 63488 00:21:14.705 } 00:21:14.705 ] 00:21:14.705 }' 00:21:14.705 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:14.705 04:22:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:15.270 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:15.270 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:15.270 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:15.270 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:15.270 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:15.270 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.270 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:15.531 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:15.531 "name": "raid_bdev1", 00:21:15.531 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:15.531 "strip_size_kb": 0, 00:21:15.531 "state": "online", 00:21:15.531 "raid_level": "raid1", 00:21:15.531 "superblock": true, 00:21:15.531 "num_base_bdevs": 4, 00:21:15.531 "num_base_bdevs_discovered": 3, 00:21:15.531 "num_base_bdevs_operational": 3, 00:21:15.531 "base_bdevs_list": [ 00:21:15.531 { 00:21:15.531 "name": null, 00:21:15.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:15.531 "is_configured": false, 00:21:15.531 "data_offset": 2048, 00:21:15.531 "data_size": 63488 00:21:15.531 }, 00:21:15.531 { 00:21:15.531 "name": "BaseBdev2", 00:21:15.531 "uuid": "118fefa4-d44d-5080-96a4-f2a1cc453ff5", 00:21:15.531 "is_configured": true, 00:21:15.531 "data_offset": 2048, 00:21:15.531 "data_size": 63488 00:21:15.531 }, 00:21:15.531 { 00:21:15.531 "name": "BaseBdev3", 00:21:15.531 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:15.531 "is_configured": true, 00:21:15.531 "data_offset": 2048, 00:21:15.531 "data_size": 63488 00:21:15.531 }, 00:21:15.532 { 00:21:15.532 "name": "BaseBdev4", 00:21:15.532 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:15.532 "is_configured": true, 00:21:15.532 "data_offset": 2048, 00:21:15.532 "data_size": 63488 00:21:15.532 } 00:21:15.532 ] 00:21:15.532 }' 00:21:15.532 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:15.532 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:15.532 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:15.532 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:15.532 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:15.792 [2024-05-15 04:22:03.732592] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:15.792 04:22:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # sleep 1 00:21:15.792 [2024-05-15 04:22:03.772964] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c16770 00:21:15.792 [2024-05-15 04:22:03.774457] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:16.050 [2024-05-15 04:22:03.884080] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:16.050 [2024-05-15 04:22:03.884613] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:21:16.050 [2024-05-15 04:22:04.011450] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:16.050 [2024-05-15 04:22:04.012222] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:21:16.616 [2024-05-15 04:22:04.367367] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: 
split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:16.616 [2024-05-15 04:22:04.368818] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:21:16.616 [2024-05-15 04:22:04.581471] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:16.616 [2024-05-15 04:22:04.582277] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:21:16.873 04:22:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:16.873 04:22:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:16.873 04:22:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:16.873 04:22:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:16.873 04:22:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:16.873 04:22:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.873 04:22:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:17.130 [2024-05-15 04:22:04.918221] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:21:17.130 [2024-05-15 04:22:05.046852] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:21:17.130 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:17.130 "name": "raid_bdev1", 00:21:17.130 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:17.130 "strip_size_kb": 0, 00:21:17.130 "state": "online", 00:21:17.130 "raid_level": "raid1", 00:21:17.130 "superblock": true, 00:21:17.130 "num_base_bdevs": 4, 00:21:17.130 "num_base_bdevs_discovered": 4, 00:21:17.130 "num_base_bdevs_operational": 4, 00:21:17.130 "process": { 00:21:17.130 "type": "rebuild", 00:21:17.130 "target": "spare", 00:21:17.130 "progress": { 00:21:17.130 "blocks": 14336, 00:21:17.130 "percent": 22 00:21:17.130 } 00:21:17.130 }, 00:21:17.130 "base_bdevs_list": [ 00:21:17.130 { 00:21:17.130 "name": "spare", 00:21:17.130 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:17.130 "is_configured": true, 00:21:17.130 "data_offset": 2048, 00:21:17.130 "data_size": 63488 00:21:17.130 }, 00:21:17.130 { 00:21:17.130 "name": "BaseBdev2", 00:21:17.130 "uuid": "118fefa4-d44d-5080-96a4-f2a1cc453ff5", 00:21:17.130 "is_configured": true, 00:21:17.130 "data_offset": 2048, 00:21:17.130 "data_size": 63488 00:21:17.130 }, 00:21:17.130 { 00:21:17.130 "name": "BaseBdev3", 00:21:17.130 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:17.130 "is_configured": true, 00:21:17.130 "data_offset": 2048, 00:21:17.130 "data_size": 63488 00:21:17.130 }, 00:21:17.130 { 00:21:17.130 "name": "BaseBdev4", 00:21:17.130 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:17.130 "is_configured": true, 00:21:17.130 "data_offset": 2048, 00:21:17.130 "data_size": 63488 00:21:17.130 } 00:21:17.130 ] 00:21:17.130 }' 00:21:17.130 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 
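The verify_raid_bdev_state and verify_raid_bdev_process helpers seen in the trace both work by dumping bdev_raid_get_bdevs and filtering it with jq: while a rebuild is running the raid bdev reports a "process" object with type "rebuild", target "spare", and a blocks/percent progress counter, and once the process finishes or its target is removed the field reverts to "none". A short sketch of the same checks, assuming the raid_bdev1 array from the previous sketch and using the info shell variable only for illustration, is:

    rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # dump the raid bdev's state as the helpers in the trace do
    info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')

    echo "$info" | jq -r '.state'                      # stays "online" even while degraded
    echo "$info" | jq -r '.num_base_bdevs_discovered'  # drops to 3 after a base bdev is removed
    echo "$info" | jq -r '.process.type // "none"'     # "rebuild" while the spare is being rebuilt
    echo "$info" | jq -r '.process.target // "none"'   # "spare"
    echo "$info" | jq -r '.process.progress.percent'   # e.g. 22 at 14336 of 63488 blocks

The jq paths mirror the filters used in the log; polling them in a loop is how the test decides when the rebuild has started, progressed, or been aborted.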
00:21:17.130 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:17.130 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:17.387 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:17.387 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@666 -- # '[' true = true ']' 00:21:17.387 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@666 -- # '[' = false ']' 00:21:17.387 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 666: [: =: unary operator expected 00:21:17.387 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=4 00:21:17.387 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:21:17.387 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@693 -- # '[' 4 -gt 2 ']' 00:21:17.387 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@695 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:17.387 [2024-05-15 04:22:05.372450] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:21:17.387 [2024-05-15 04:22:05.394247] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:17.647 [2024-05-15 04:22:05.576594] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:17.647 [2024-05-15 04:22:05.576807] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:17.905 [2024-05-15 04:22:05.686673] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1c164d0 00:21:17.905 [2024-05-15 04:22:05.686700] bdev_raid.c:1969:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1c16770 00:21:17.905 [2024-05-15 04:22:05.688047] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:21:17.905 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # base_bdevs[1]= 00:21:17.905 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@699 -- # (( num_base_bdevs_operational-- )) 00:21:17.905 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@702 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:17.905 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:17.905 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:17.905 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:17.905 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:17.905 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.905 04:22:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.164 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:18.164 "name": "raid_bdev1", 00:21:18.164 "uuid": 
"f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:18.164 "strip_size_kb": 0, 00:21:18.164 "state": "online", 00:21:18.164 "raid_level": "raid1", 00:21:18.164 "superblock": true, 00:21:18.164 "num_base_bdevs": 4, 00:21:18.164 "num_base_bdevs_discovered": 3, 00:21:18.164 "num_base_bdevs_operational": 3, 00:21:18.164 "process": { 00:21:18.164 "type": "rebuild", 00:21:18.164 "target": "spare", 00:21:18.164 "progress": { 00:21:18.164 "blocks": 26624, 00:21:18.164 "percent": 41 00:21:18.164 } 00:21:18.164 }, 00:21:18.164 "base_bdevs_list": [ 00:21:18.164 { 00:21:18.164 "name": "spare", 00:21:18.164 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:18.164 "is_configured": true, 00:21:18.164 "data_offset": 2048, 00:21:18.164 "data_size": 63488 00:21:18.164 }, 00:21:18.164 { 00:21:18.164 "name": null, 00:21:18.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.164 "is_configured": false, 00:21:18.164 "data_offset": 2048, 00:21:18.164 "data_size": 63488 00:21:18.164 }, 00:21:18.164 { 00:21:18.164 "name": "BaseBdev3", 00:21:18.164 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:18.164 "is_configured": true, 00:21:18.164 "data_offset": 2048, 00:21:18.164 "data_size": 63488 00:21:18.164 }, 00:21:18.164 { 00:21:18.164 "name": "BaseBdev4", 00:21:18.164 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:18.164 "is_configured": true, 00:21:18.164 "data_offset": 2048, 00:21:18.164 "data_size": 63488 00:21:18.164 } 00:21:18.164 ] 00:21:18.164 }' 00:21:18.164 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:18.164 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:18.164 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # local timeout=802 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:18.452 [2024-05-15 04:22:06.323385] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:18.452 "name": "raid_bdev1", 00:21:18.452 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:18.452 "strip_size_kb": 0, 00:21:18.452 "state": "online", 00:21:18.452 "raid_level": "raid1", 00:21:18.452 "superblock": 
true, 00:21:18.452 "num_base_bdevs": 4, 00:21:18.452 "num_base_bdevs_discovered": 3, 00:21:18.452 "num_base_bdevs_operational": 3, 00:21:18.452 "process": { 00:21:18.452 "type": "rebuild", 00:21:18.452 "target": "spare", 00:21:18.452 "progress": { 00:21:18.452 "blocks": 32768, 00:21:18.452 "percent": 51 00:21:18.452 } 00:21:18.452 }, 00:21:18.452 "base_bdevs_list": [ 00:21:18.452 { 00:21:18.452 "name": "spare", 00:21:18.452 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:18.452 "is_configured": true, 00:21:18.452 "data_offset": 2048, 00:21:18.452 "data_size": 63488 00:21:18.452 }, 00:21:18.452 { 00:21:18.452 "name": null, 00:21:18.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:18.452 "is_configured": false, 00:21:18.452 "data_offset": 2048, 00:21:18.452 "data_size": 63488 00:21:18.452 }, 00:21:18.452 { 00:21:18.452 "name": "BaseBdev3", 00:21:18.452 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:18.452 "is_configured": true, 00:21:18.452 "data_offset": 2048, 00:21:18.452 "data_size": 63488 00:21:18.452 }, 00:21:18.452 { 00:21:18.452 "name": "BaseBdev4", 00:21:18.452 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:18.452 "is_configured": true, 00:21:18.452 "data_offset": 2048, 00:21:18.452 "data_size": 63488 00:21:18.452 } 00:21:18.452 ] 00:21:18.452 }' 00:21:18.452 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:18.452 [2024-05-15 04:22:06.433084] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:21:18.735 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:18.735 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:18.735 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:18.735 04:22:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@711 -- # sleep 1 00:21:19.301 [2024-05-15 04:22:07.231975] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:21:19.560 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:21:19.560 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:19.560 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:19.560 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:19.560 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:19.560 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:19.560 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.560 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:19.818 [2024-05-15 04:22:07.659040] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:21:19.818 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:19.818 "name": "raid_bdev1", 00:21:19.818 "uuid": 
"f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:19.818 "strip_size_kb": 0, 00:21:19.818 "state": "online", 00:21:19.818 "raid_level": "raid1", 00:21:19.818 "superblock": true, 00:21:19.818 "num_base_bdevs": 4, 00:21:19.818 "num_base_bdevs_discovered": 3, 00:21:19.818 "num_base_bdevs_operational": 3, 00:21:19.818 "process": { 00:21:19.818 "type": "rebuild", 00:21:19.818 "target": "spare", 00:21:19.818 "progress": { 00:21:19.818 "blocks": 53248, 00:21:19.818 "percent": 83 00:21:19.818 } 00:21:19.818 }, 00:21:19.818 "base_bdevs_list": [ 00:21:19.818 { 00:21:19.818 "name": "spare", 00:21:19.818 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:19.818 "is_configured": true, 00:21:19.818 "data_offset": 2048, 00:21:19.818 "data_size": 63488 00:21:19.818 }, 00:21:19.818 { 00:21:19.818 "name": null, 00:21:19.818 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:19.818 "is_configured": false, 00:21:19.818 "data_offset": 2048, 00:21:19.818 "data_size": 63488 00:21:19.818 }, 00:21:19.818 { 00:21:19.818 "name": "BaseBdev3", 00:21:19.818 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:19.818 "is_configured": true, 00:21:19.818 "data_offset": 2048, 00:21:19.818 "data_size": 63488 00:21:19.818 }, 00:21:19.818 { 00:21:19.818 "name": "BaseBdev4", 00:21:19.818 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:19.818 "is_configured": true, 00:21:19.818 "data_offset": 2048, 00:21:19.818 "data_size": 63488 00:21:19.818 } 00:21:19.818 ] 00:21:19.818 }' 00:21:19.818 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:19.818 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:19.818 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:20.076 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:20.076 04:22:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@711 -- # sleep 1 00:21:20.076 [2024-05-15 04:22:08.002049] bdev_raid.c: 857:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:21:20.335 [2024-05-15 04:22:08.342060] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:20.593 [2024-05-15 04:22:08.442385] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:20.593 [2024-05-15 04:22:08.443765] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:20.851 04:22:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:21:20.851 04:22:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:20.851 04:22:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:20.851 04:22:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:20.851 04:22:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:20.851 04:22:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:20.851 04:22:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.851 04:22:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.109 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:21.109 "name": "raid_bdev1", 00:21:21.109 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:21.109 "strip_size_kb": 0, 00:21:21.109 "state": "online", 00:21:21.109 "raid_level": "raid1", 00:21:21.109 "superblock": true, 00:21:21.109 "num_base_bdevs": 4, 00:21:21.109 "num_base_bdevs_discovered": 3, 00:21:21.109 "num_base_bdevs_operational": 3, 00:21:21.109 "base_bdevs_list": [ 00:21:21.109 { 00:21:21.109 "name": "spare", 00:21:21.109 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:21.109 "is_configured": true, 00:21:21.109 "data_offset": 2048, 00:21:21.109 "data_size": 63488 00:21:21.109 }, 00:21:21.109 { 00:21:21.109 "name": null, 00:21:21.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.109 "is_configured": false, 00:21:21.109 "data_offset": 2048, 00:21:21.109 "data_size": 63488 00:21:21.109 }, 00:21:21.109 { 00:21:21.109 "name": "BaseBdev3", 00:21:21.109 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:21.109 "is_configured": true, 00:21:21.109 "data_offset": 2048, 00:21:21.109 "data_size": 63488 00:21:21.109 }, 00:21:21.109 { 00:21:21.109 "name": "BaseBdev4", 00:21:21.110 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:21.110 "is_configured": true, 00:21:21.110 "data_offset": 2048, 00:21:21.110 "data_size": 63488 00:21:21.110 } 00:21:21.110 ] 00:21:21.110 }' 00:21:21.110 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@709 -- # break 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.368 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:21.626 "name": "raid_bdev1", 00:21:21.626 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:21.626 "strip_size_kb": 0, 00:21:21.626 "state": "online", 00:21:21.626 "raid_level": "raid1", 00:21:21.626 "superblock": true, 00:21:21.626 "num_base_bdevs": 4, 00:21:21.626 "num_base_bdevs_discovered": 3, 00:21:21.626 "num_base_bdevs_operational": 3, 00:21:21.626 "base_bdevs_list": [ 00:21:21.626 { 00:21:21.626 "name": "spare", 00:21:21.626 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:21.626 "is_configured": 
true, 00:21:21.626 "data_offset": 2048, 00:21:21.626 "data_size": 63488 00:21:21.626 }, 00:21:21.626 { 00:21:21.626 "name": null, 00:21:21.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.626 "is_configured": false, 00:21:21.626 "data_offset": 2048, 00:21:21.626 "data_size": 63488 00:21:21.626 }, 00:21:21.626 { 00:21:21.626 "name": "BaseBdev3", 00:21:21.626 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:21.626 "is_configured": true, 00:21:21.626 "data_offset": 2048, 00:21:21.626 "data_size": 63488 00:21:21.626 }, 00:21:21.626 { 00:21:21.626 "name": "BaseBdev4", 00:21:21.626 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:21.626 "is_configured": true, 00:21:21.626 "data_offset": 2048, 00:21:21.626 "data_size": 63488 00:21:21.626 } 00:21:21.626 ] 00:21:21.626 }' 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.626 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:21.885 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:21.885 "name": "raid_bdev1", 00:21:21.885 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:21.885 "strip_size_kb": 0, 00:21:21.885 "state": "online", 00:21:21.885 "raid_level": "raid1", 00:21:21.885 "superblock": true, 00:21:21.885 "num_base_bdevs": 4, 00:21:21.885 "num_base_bdevs_discovered": 3, 00:21:21.885 "num_base_bdevs_operational": 3, 00:21:21.885 "base_bdevs_list": [ 00:21:21.885 { 00:21:21.885 "name": "spare", 00:21:21.885 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:21.885 "is_configured": true, 00:21:21.885 "data_offset": 2048, 00:21:21.885 "data_size": 63488 00:21:21.885 }, 00:21:21.885 { 00:21:21.885 "name": null, 00:21:21.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.885 
"is_configured": false, 00:21:21.885 "data_offset": 2048, 00:21:21.885 "data_size": 63488 00:21:21.885 }, 00:21:21.885 { 00:21:21.885 "name": "BaseBdev3", 00:21:21.885 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:21.885 "is_configured": true, 00:21:21.885 "data_offset": 2048, 00:21:21.885 "data_size": 63488 00:21:21.885 }, 00:21:21.885 { 00:21:21.885 "name": "BaseBdev4", 00:21:21.885 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:21.885 "is_configured": true, 00:21:21.885 "data_offset": 2048, 00:21:21.885 "data_size": 63488 00:21:21.885 } 00:21:21.885 ] 00:21:21.885 }' 00:21:21.885 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:21.885 04:22:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:22.451 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:22.708 [2024-05-15 04:22:10.546973] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:22.708 [2024-05-15 04:22:10.547029] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:22.708 00:21:22.708 Latency(us) 00:21:22.708 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:22.708 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:21:22.708 raid_bdev1 : 11.64 97.99 293.96 0.00 0.00 13841.78 232.11 121168.78 00:21:22.708 =================================================================================================================== 00:21:22.708 Total : 97.99 293.96 0.00 0.00 13841.78 232.11 121168.78 00:21:22.708 [2024-05-15 04:22:10.651168] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:22.708 [2024-05-15 04:22:10.651204] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:22.708 [2024-05-15 04:22:10.651311] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:22.708 [2024-05-15 04:22:10.651329] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c13e90 name raid_bdev1, state offline 00:21:22.708 0 00:21:22.708 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.708 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@720 -- # jq length 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # '[' true = true ']' 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@11 -- # local nbd_list 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:22.965 04:22:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:21:23.222 /dev/nbd0 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:23.222 1+0 records in 00:21:23.222 1+0 records out 00:21:23.222 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000166505 s, 24.6 MB/s 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # for bdev in "${base_bdevs[@]:1}" 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # '[' -z '' ']' 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@728 -- # continue 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # for bdev in "${base_bdevs[@]:1}" 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # '[' -z BaseBdev3 ']' 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:21:23.222 04:22:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:23.222 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:21:23.480 /dev/nbd1 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:23.480 1+0 records in 00:21:23.480 1+0 records out 00:21:23.480 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000187554 s, 21.8 MB/s 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:23.480 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:23.481 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:23.738 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock /dev/nbd1 00:21:23.738 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:23.738 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:23.738 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:23.738 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:23.738 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:23.738 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # for bdev in "${base_bdevs[@]:1}" 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # '[' -z BaseBdev4 ']' 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:23.997 04:22:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:21:24.255 /dev/nbd1 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@865 -- # local i 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 
-- # (( i = 1 )) 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # break 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:24.255 1+0 records in 00:21:24.255 1+0 records out 00:21:24.255 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000213396 s, 19.2 MB/s 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # size=4096 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # return 0 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@732 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:24.255 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:24.513 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:24.513 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:24.513 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:24.513 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:24.513 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:24.514 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:24.514 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:24.514 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@45 -- # return 0 00:21:24.514 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@734 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:24.514 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:24.514 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:24.514 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:24.514 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:21:24.514 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:24.514 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:24.772 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:24.772 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:24.772 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:24.772 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:24.772 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:24.772 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:24.772 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:21:24.772 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:21:24.772 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@743 -- # '[' true = true ']' 00:21:24.772 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:25.029 04:22:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:25.286 [2024-05-15 04:22:13.170718] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:25.286 [2024-05-15 04:22:13.170782] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:25.286 [2024-05-15 04:22:13.170851] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a65c10 00:21:25.286 [2024-05-15 04:22:13.170903] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:25.286 [2024-05-15 04:22:13.172474] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:25.286 [2024-05-15 04:22:13.172498] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:25.286 [2024-05-15 04:22:13.172595] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:25.286 [2024-05-15 04:22:13.172631] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:25.286 [2024-05-15 04:22:13.172748] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:25.286 [2024-05-15 04:22:13.172835] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:25.286 spare 00:21:25.286 04:22:13 
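The nbd_start_disk / cmp / nbd_stop_disk sequence traced above is how the test proves that the rebuilt spare carries the same data as each surviving base bdev: every member is exported as a kernel block device, the superblock area is skipped, and the payload is byte-compared. A minimal standalone sketch of that flow (assuming the rpc.py path and RPC socket used in this run, and free /dev/nbd0 and /dev/nbd1 devices):

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  rpc nbd_start_disk spare /dev/nbd0         # export the rebuilt spare
  rpc nbd_start_disk BaseBdev3 /dev/nbd1     # export one surviving base bdev
  cmp -i 1048576 /dev/nbd0 /dev/nbd1         # skip the 1 MiB data_offset (2048 blocks x 512 B), compare the data area
  rpc nbd_stop_disk /dev/nbd1                # the trace above repeats start/cmp/stop for BaseBdev4 on the same nbd1
  rpc nbd_stop_disk /dev/nbd0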
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:25.286 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:25.286 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:25.286 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:25.286 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:25.287 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=3 00:21:25.287 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:25.287 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:25.287 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:25.287 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:25.287 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.287 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:25.287 [2024-05-15 04:22:13.273179] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c16740 00:21:25.287 [2024-05-15 04:22:13.273197] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:25.287 [2024-05-15 04:22:13.273362] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a6bcd0 00:21:25.287 [2024-05-15 04:22:13.273535] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c16740 00:21:25.287 [2024-05-15 04:22:13.273549] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c16740 00:21:25.287 [2024-05-15 04:22:13.273653] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:25.544 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:25.544 "name": "raid_bdev1", 00:21:25.544 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:25.544 "strip_size_kb": 0, 00:21:25.544 "state": "online", 00:21:25.544 "raid_level": "raid1", 00:21:25.544 "superblock": true, 00:21:25.544 "num_base_bdevs": 4, 00:21:25.544 "num_base_bdevs_discovered": 3, 00:21:25.544 "num_base_bdevs_operational": 3, 00:21:25.544 "base_bdevs_list": [ 00:21:25.544 { 00:21:25.544 "name": "spare", 00:21:25.544 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:25.544 "is_configured": true, 00:21:25.544 "data_offset": 2048, 00:21:25.544 "data_size": 63488 00:21:25.544 }, 00:21:25.544 { 00:21:25.544 "name": null, 00:21:25.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:25.544 "is_configured": false, 00:21:25.544 "data_offset": 2048, 00:21:25.544 "data_size": 63488 00:21:25.544 }, 00:21:25.544 { 00:21:25.544 "name": "BaseBdev3", 00:21:25.544 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:25.544 "is_configured": true, 00:21:25.544 "data_offset": 2048, 00:21:25.544 "data_size": 63488 00:21:25.544 }, 00:21:25.544 { 00:21:25.544 "name": "BaseBdev4", 00:21:25.544 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:25.544 "is_configured": true, 00:21:25.544 "data_offset": 2048, 
00:21:25.544 "data_size": 63488 00:21:25.544 } 00:21:25.544 ] 00:21:25.544 }' 00:21:25.544 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:25.544 04:22:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:26.110 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:26.110 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:26.110 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:26.110 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:26.110 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:26.110 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.110 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:26.369 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:26.369 "name": "raid_bdev1", 00:21:26.369 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:26.369 "strip_size_kb": 0, 00:21:26.369 "state": "online", 00:21:26.369 "raid_level": "raid1", 00:21:26.369 "superblock": true, 00:21:26.369 "num_base_bdevs": 4, 00:21:26.369 "num_base_bdevs_discovered": 3, 00:21:26.369 "num_base_bdevs_operational": 3, 00:21:26.369 "base_bdevs_list": [ 00:21:26.369 { 00:21:26.369 "name": "spare", 00:21:26.369 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:26.369 "is_configured": true, 00:21:26.369 "data_offset": 2048, 00:21:26.369 "data_size": 63488 00:21:26.369 }, 00:21:26.369 { 00:21:26.369 "name": null, 00:21:26.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.369 "is_configured": false, 00:21:26.369 "data_offset": 2048, 00:21:26.369 "data_size": 63488 00:21:26.369 }, 00:21:26.369 { 00:21:26.369 "name": "BaseBdev3", 00:21:26.369 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:26.369 "is_configured": true, 00:21:26.369 "data_offset": 2048, 00:21:26.369 "data_size": 63488 00:21:26.369 }, 00:21:26.369 { 00:21:26.369 "name": "BaseBdev4", 00:21:26.369 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:26.369 "is_configured": true, 00:21:26.369 "data_offset": 2048, 00:21:26.369 "data_size": 63488 00:21:26.369 } 00:21:26.369 ] 00:21:26.369 }' 00:21:26.369 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:26.369 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:26.369 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:26.369 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:26.369 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.369 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # jq -r '.[].base_bdevs_list[0].name' 00:21:26.627 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@750 -- # [[ spare == \s\p\a\r\e ]] 00:21:26.627 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@753 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:26.885 [2024-05-15 04:22:14.839517] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:26.885 04:22:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:27.142 04:22:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:27.142 "name": "raid_bdev1", 00:21:27.142 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:27.142 "strip_size_kb": 0, 00:21:27.142 "state": "online", 00:21:27.142 "raid_level": "raid1", 00:21:27.142 "superblock": true, 00:21:27.142 "num_base_bdevs": 4, 00:21:27.142 "num_base_bdevs_discovered": 2, 00:21:27.142 "num_base_bdevs_operational": 2, 00:21:27.142 "base_bdevs_list": [ 00:21:27.142 { 00:21:27.142 "name": null, 00:21:27.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.142 "is_configured": false, 00:21:27.142 "data_offset": 2048, 00:21:27.142 "data_size": 63488 00:21:27.142 }, 00:21:27.142 { 00:21:27.142 "name": null, 00:21:27.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.142 "is_configured": false, 00:21:27.142 "data_offset": 2048, 00:21:27.142 "data_size": 63488 00:21:27.142 }, 00:21:27.142 { 00:21:27.142 "name": "BaseBdev3", 00:21:27.142 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:27.142 "is_configured": true, 00:21:27.142 "data_offset": 2048, 00:21:27.142 "data_size": 63488 00:21:27.142 }, 00:21:27.142 { 00:21:27.142 "name": "BaseBdev4", 00:21:27.142 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:27.142 "is_configured": true, 00:21:27.142 "data_offset": 2048, 00:21:27.142 "data_size": 63488 00:21:27.142 } 00:21:27.142 ] 00:21:27.142 }' 00:21:27.142 04:22:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:27.142 04:22:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:27.707 04:22:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 
spare 00:21:27.965 [2024-05-15 04:22:15.870394] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:27.965 [2024-05-15 04:22:15.870603] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:21:27.965 [2024-05-15 04:22:15.870626] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:21:27.965 [2024-05-15 04:22:15.870660] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:27.965 [2024-05-15 04:22:15.876341] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1773b90 00:21:27.965 [2024-05-15 04:22:15.878532] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:27.965 04:22:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # sleep 1 00:21:28.899 04:22:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@757 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:28.899 04:22:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:28.899 04:22:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:28.899 04:22:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:28.899 04:22:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:28.899 04:22:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.899 04:22:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:29.156 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:29.156 "name": "raid_bdev1", 00:21:29.156 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:29.156 "strip_size_kb": 0, 00:21:29.156 "state": "online", 00:21:29.156 "raid_level": "raid1", 00:21:29.156 "superblock": true, 00:21:29.156 "num_base_bdevs": 4, 00:21:29.156 "num_base_bdevs_discovered": 3, 00:21:29.156 "num_base_bdevs_operational": 3, 00:21:29.156 "process": { 00:21:29.156 "type": "rebuild", 00:21:29.156 "target": "spare", 00:21:29.156 "progress": { 00:21:29.156 "blocks": 24576, 00:21:29.156 "percent": 38 00:21:29.156 } 00:21:29.156 }, 00:21:29.156 "base_bdevs_list": [ 00:21:29.156 { 00:21:29.156 "name": "spare", 00:21:29.156 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:29.156 "is_configured": true, 00:21:29.156 "data_offset": 2048, 00:21:29.156 "data_size": 63488 00:21:29.156 }, 00:21:29.156 { 00:21:29.156 "name": null, 00:21:29.156 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:29.156 "is_configured": false, 00:21:29.156 "data_offset": 2048, 00:21:29.156 "data_size": 63488 00:21:29.156 }, 00:21:29.156 { 00:21:29.156 "name": "BaseBdev3", 00:21:29.156 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:29.156 "is_configured": true, 00:21:29.156 "data_offset": 2048, 00:21:29.156 "data_size": 63488 00:21:29.156 }, 00:21:29.156 { 00:21:29.156 "name": "BaseBdev4", 00:21:29.156 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:29.156 "is_configured": true, 00:21:29.156 "data_offset": 2048, 00:21:29.156 "data_size": 63488 00:21:29.156 } 00:21:29.156 ] 00:21:29.156 }' 00:21:29.156 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.type // "none"' 00:21:29.426 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:29.426 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:29.426 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:29.426 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:29.688 [2024-05-15 04:22:17.480921] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:29.688 [2024-05-15 04:22:17.492026] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:29.688 [2024-05-15 04:22:17.492096] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.688 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:29.946 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:29.946 "name": "raid_bdev1", 00:21:29.946 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:29.946 "strip_size_kb": 0, 00:21:29.946 "state": "online", 00:21:29.946 "raid_level": "raid1", 00:21:29.946 "superblock": true, 00:21:29.946 "num_base_bdevs": 4, 00:21:29.946 "num_base_bdevs_discovered": 2, 00:21:29.946 "num_base_bdevs_operational": 2, 00:21:29.946 "base_bdevs_list": [ 00:21:29.946 { 00:21:29.946 "name": null, 00:21:29.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:29.946 "is_configured": false, 00:21:29.946 "data_offset": 2048, 00:21:29.946 "data_size": 63488 00:21:29.946 }, 00:21:29.946 { 00:21:29.946 "name": null, 00:21:29.946 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:29.946 "is_configured": false, 00:21:29.946 "data_offset": 2048, 00:21:29.946 "data_size": 63488 00:21:29.946 }, 00:21:29.946 { 00:21:29.946 "name": "BaseBdev3", 00:21:29.946 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:29.946 "is_configured": true, 00:21:29.946 "data_offset": 2048, 00:21:29.946 
"data_size": 63488 00:21:29.946 }, 00:21:29.946 { 00:21:29.946 "name": "BaseBdev4", 00:21:29.946 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:29.946 "is_configured": true, 00:21:29.946 "data_offset": 2048, 00:21:29.946 "data_size": 63488 00:21:29.946 } 00:21:29.946 ] 00:21:29.946 }' 00:21:29.946 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:29.946 04:22:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:30.510 04:22:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:30.769 [2024-05-15 04:22:18.580978] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:30.769 [2024-05-15 04:22:18.581053] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:30.769 [2024-05-15 04:22:18.581077] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a68a40 00:21:30.769 [2024-05-15 04:22:18.581091] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:30.769 [2024-05-15 04:22:18.581502] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:30.769 [2024-05-15 04:22:18.581525] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:30.769 [2024-05-15 04:22:18.581615] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:21:30.769 [2024-05-15 04:22:18.581632] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:21:30.769 [2024-05-15 04:22:18.581642] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:21:30.769 [2024-05-15 04:22:18.581664] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:30.769 [2024-05-15 04:22:18.586833] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b01060 00:21:30.769 spare 00:21:30.769 [2024-05-15 04:22:18.588193] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:30.769 04:22:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # sleep 1 00:21:31.702 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:31.702 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:31.702 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:21:31.702 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=spare 00:21:31.702 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:31.702 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.702 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:31.960 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:31.960 "name": "raid_bdev1", 00:21:31.960 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:31.960 "strip_size_kb": 0, 00:21:31.960 "state": "online", 00:21:31.960 "raid_level": "raid1", 00:21:31.960 "superblock": true, 00:21:31.960 "num_base_bdevs": 4, 00:21:31.960 "num_base_bdevs_discovered": 3, 00:21:31.960 "num_base_bdevs_operational": 3, 00:21:31.960 "process": { 00:21:31.960 "type": "rebuild", 00:21:31.960 "target": "spare", 00:21:31.960 "progress": { 00:21:31.960 "blocks": 24576, 00:21:31.960 "percent": 38 00:21:31.960 } 00:21:31.960 }, 00:21:31.960 "base_bdevs_list": [ 00:21:31.960 { 00:21:31.960 "name": "spare", 00:21:31.960 "uuid": "9145fd4d-ced7-57d3-bfb5-f1169503eab6", 00:21:31.960 "is_configured": true, 00:21:31.960 "data_offset": 2048, 00:21:31.960 "data_size": 63488 00:21:31.960 }, 00:21:31.960 { 00:21:31.960 "name": null, 00:21:31.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:31.960 "is_configured": false, 00:21:31.960 "data_offset": 2048, 00:21:31.960 "data_size": 63488 00:21:31.960 }, 00:21:31.960 { 00:21:31.960 "name": "BaseBdev3", 00:21:31.960 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:31.960 "is_configured": true, 00:21:31.960 "data_offset": 2048, 00:21:31.960 "data_size": 63488 00:21:31.960 }, 00:21:31.960 { 00:21:31.960 "name": "BaseBdev4", 00:21:31.960 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:31.960 "is_configured": true, 00:21:31.960 "data_offset": 2048, 00:21:31.960 "data_size": 63488 00:21:31.960 } 00:21:31.960 ] 00:21:31.960 }' 00:21:31.960 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:31.960 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:31.960 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:31.960 04:22:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:21:31.960 04:22:19 
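While the rebuild runs, the script polls bdev_raid_get_bdevs and reads the process object embedded in the raid bdev, which is exactly what the jq filters above extract. Under the same socket assumption, the same progress fields can be read directly:

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  info=$(rpc bdev_raid_get_bdevs all | jq '.[] | select(.name == "raid_bdev1")')
  jq -r '.process.type // "none"'   <<< "$info"   # "rebuild" while a rebuild is in flight, otherwise "none"
  jq -r '.process.target // "none"' <<< "$info"   # the member being rebuilt, "spare" in this run
  jq -r '.process.progress.percent' <<< "$info"   # e.g. 38 (24576 of 63488 blocks) at this point in the trace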
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:21:32.218 [2024-05-15 04:22:20.159433] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:32.218 [2024-05-15 04:22:20.201614] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:32.218 [2024-05-15 04:22:20.201681] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:32.218 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:32.477 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:32.477 "name": "raid_bdev1", 00:21:32.477 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:32.477 "strip_size_kb": 0, 00:21:32.477 "state": "online", 00:21:32.477 "raid_level": "raid1", 00:21:32.477 "superblock": true, 00:21:32.477 "num_base_bdevs": 4, 00:21:32.477 "num_base_bdevs_discovered": 2, 00:21:32.477 "num_base_bdevs_operational": 2, 00:21:32.477 "base_bdevs_list": [ 00:21:32.477 { 00:21:32.477 "name": null, 00:21:32.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.477 "is_configured": false, 00:21:32.477 "data_offset": 2048, 00:21:32.477 "data_size": 63488 00:21:32.477 }, 00:21:32.477 { 00:21:32.477 "name": null, 00:21:32.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:32.477 "is_configured": false, 00:21:32.477 "data_offset": 2048, 00:21:32.477 "data_size": 63488 00:21:32.477 }, 00:21:32.477 { 00:21:32.477 "name": "BaseBdev3", 00:21:32.477 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:32.477 "is_configured": true, 00:21:32.477 "data_offset": 2048, 00:21:32.477 "data_size": 63488 00:21:32.477 }, 00:21:32.477 { 00:21:32.477 "name": "BaseBdev4", 00:21:32.477 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:32.477 "is_configured": true, 00:21:32.477 "data_offset": 2048, 00:21:32.477 "data_size": 63488 00:21:32.477 } 00:21:32.477 ] 00:21:32.477 }' 00:21:32.477 04:22:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:32.477 
04:22:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:33.043 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:33.043 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:33.043 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:33.043 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:33.043 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:33.043 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.043 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:33.301 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:33.301 "name": "raid_bdev1", 00:21:33.301 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:33.301 "strip_size_kb": 0, 00:21:33.301 "state": "online", 00:21:33.301 "raid_level": "raid1", 00:21:33.301 "superblock": true, 00:21:33.301 "num_base_bdevs": 4, 00:21:33.301 "num_base_bdevs_discovered": 2, 00:21:33.301 "num_base_bdevs_operational": 2, 00:21:33.301 "base_bdevs_list": [ 00:21:33.301 { 00:21:33.301 "name": null, 00:21:33.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.301 "is_configured": false, 00:21:33.301 "data_offset": 2048, 00:21:33.301 "data_size": 63488 00:21:33.301 }, 00:21:33.301 { 00:21:33.301 "name": null, 00:21:33.301 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.301 "is_configured": false, 00:21:33.301 "data_offset": 2048, 00:21:33.301 "data_size": 63488 00:21:33.301 }, 00:21:33.301 { 00:21:33.301 "name": "BaseBdev3", 00:21:33.301 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:33.301 "is_configured": true, 00:21:33.301 "data_offset": 2048, 00:21:33.301 "data_size": 63488 00:21:33.301 }, 00:21:33.301 { 00:21:33.301 "name": "BaseBdev4", 00:21:33.301 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:33.301 "is_configured": true, 00:21:33.301 "data_offset": 2048, 00:21:33.301 "data_size": 63488 00:21:33.301 } 00:21:33.301 ] 00:21:33.301 }' 00:21:33.301 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:33.559 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:33.559 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:33.559 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:33.559 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:21:33.818 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:34.076 [2024-05-15 04:22:21.888277] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:34.076 [2024-05-15 04:22:21.888344] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 
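BaseBdev1 then gets the same delete-and-recreate treatment on top of BaseBdev1_malloc, with the opposite outcome: as the examine output just below reports, the raid superblock does not contain this bdev's uuid, so it is left out and raid_bdev1 keeps running with two discovered members. A short sketch of that contrast, under the same assumptions as the snippets above:

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  rpc bdev_passthru_delete BaseBdev1
  rpc bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
  # unlike the spare, BaseBdev1 is not pulled back in: the first two slots stay null
  rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .base_bdevs_list[].name'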
00:21:34.076 [2024-05-15 04:22:21.888374] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a68700 00:21:34.076 [2024-05-15 04:22:21.888387] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:34.076 [2024-05-15 04:22:21.888783] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:34.076 [2024-05-15 04:22:21.888821] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:34.076 [2024-05-15 04:22:21.888926] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:21:34.076 [2024-05-15 04:22:21.888944] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:21:34.076 [2024-05-15 04:22:21.888954] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:34.076 BaseBdev1 00:21:34.076 04:22:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # sleep 1 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.010 04:22:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:35.268 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:35.268 "name": "raid_bdev1", 00:21:35.268 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:35.268 "strip_size_kb": 0, 00:21:35.268 "state": "online", 00:21:35.268 "raid_level": "raid1", 00:21:35.268 "superblock": true, 00:21:35.268 "num_base_bdevs": 4, 00:21:35.268 "num_base_bdevs_discovered": 2, 00:21:35.268 "num_base_bdevs_operational": 2, 00:21:35.268 "base_bdevs_list": [ 00:21:35.268 { 00:21:35.268 "name": null, 00:21:35.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.268 "is_configured": false, 00:21:35.268 "data_offset": 2048, 00:21:35.268 "data_size": 63488 00:21:35.268 }, 00:21:35.268 { 00:21:35.268 "name": null, 00:21:35.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.268 "is_configured": false, 00:21:35.268 "data_offset": 2048, 00:21:35.268 "data_size": 63488 00:21:35.268 }, 00:21:35.268 { 00:21:35.268 "name": "BaseBdev3", 00:21:35.268 "uuid": 
"abdb8432-f958-5456-848a-730ff985c282", 00:21:35.268 "is_configured": true, 00:21:35.268 "data_offset": 2048, 00:21:35.268 "data_size": 63488 00:21:35.268 }, 00:21:35.268 { 00:21:35.268 "name": "BaseBdev4", 00:21:35.268 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:35.268 "is_configured": true, 00:21:35.268 "data_offset": 2048, 00:21:35.268 "data_size": 63488 00:21:35.268 } 00:21:35.268 ] 00:21:35.268 }' 00:21:35.268 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:35.268 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:35.834 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:35.834 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:35.834 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:35.834 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:35.834 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:35.834 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.834 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:36.092 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:36.092 "name": "raid_bdev1", 00:21:36.092 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:36.092 "strip_size_kb": 0, 00:21:36.092 "state": "online", 00:21:36.092 "raid_level": "raid1", 00:21:36.092 "superblock": true, 00:21:36.092 "num_base_bdevs": 4, 00:21:36.092 "num_base_bdevs_discovered": 2, 00:21:36.092 "num_base_bdevs_operational": 2, 00:21:36.092 "base_bdevs_list": [ 00:21:36.092 { 00:21:36.092 "name": null, 00:21:36.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.092 "is_configured": false, 00:21:36.092 "data_offset": 2048, 00:21:36.092 "data_size": 63488 00:21:36.092 }, 00:21:36.092 { 00:21:36.092 "name": null, 00:21:36.092 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.092 "is_configured": false, 00:21:36.092 "data_offset": 2048, 00:21:36.092 "data_size": 63488 00:21:36.092 }, 00:21:36.092 { 00:21:36.092 "name": "BaseBdev3", 00:21:36.092 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:36.092 "is_configured": true, 00:21:36.092 "data_offset": 2048, 00:21:36.092 "data_size": 63488 00:21:36.092 }, 00:21:36.092 { 00:21:36.092 "name": "BaseBdev4", 00:21:36.092 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:36.092 "is_configured": true, 00:21:36.092 "data_offset": 2048, 00:21:36.092 "data_size": 63488 00:21:36.092 } 00:21:36.092 ] 00:21:36.092 }' 00:21:36.092 04:22:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:36.092 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:21:36.350 [2024-05-15 04:22:24.331036] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:36.350 [2024-05-15 04:22:24.331211] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:21:36.350 [2024-05-15 04:22:24.331232] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:21:36.350 request: 00:21:36.350 { 00:21:36.350 "raid_bdev": "raid_bdev1", 00:21:36.350 "base_bdev": "BaseBdev1", 00:21:36.350 "method": "bdev_raid_add_base_bdev", 00:21:36.350 "req_id": 1 00:21:36.350 } 00:21:36.350 Got JSON-RPC error response 00:21:36.350 response: 00:21:36.350 { 00:21:36.350 "code": -22, 00:21:36.350 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:21:36.350 } 00:21:36.350 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:21:36.350 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:36.350 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:36.350 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:36.350 04:22:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # sleep 1 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # 
local expected_state=online 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:37.723 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:37.723 "name": "raid_bdev1", 00:21:37.723 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:37.723 "strip_size_kb": 0, 00:21:37.723 "state": "online", 00:21:37.723 "raid_level": "raid1", 00:21:37.723 "superblock": true, 00:21:37.723 "num_base_bdevs": 4, 00:21:37.723 "num_base_bdevs_discovered": 2, 00:21:37.724 "num_base_bdevs_operational": 2, 00:21:37.724 "base_bdevs_list": [ 00:21:37.724 { 00:21:37.724 "name": null, 00:21:37.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.724 "is_configured": false, 00:21:37.724 "data_offset": 2048, 00:21:37.724 "data_size": 63488 00:21:37.724 }, 00:21:37.724 { 00:21:37.724 "name": null, 00:21:37.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:37.724 "is_configured": false, 00:21:37.724 "data_offset": 2048, 00:21:37.724 "data_size": 63488 00:21:37.724 }, 00:21:37.724 { 00:21:37.724 "name": "BaseBdev3", 00:21:37.724 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:37.724 "is_configured": true, 00:21:37.724 "data_offset": 2048, 00:21:37.724 "data_size": 63488 00:21:37.724 }, 00:21:37.724 { 00:21:37.724 "name": "BaseBdev4", 00:21:37.724 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:37.724 "is_configured": true, 00:21:37.724 "data_offset": 2048, 00:21:37.724 "data_size": 63488 00:21:37.724 } 00:21:37.724 ] 00:21:37.724 }' 00:21:37.724 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:37.724 04:22:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:38.293 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:38.293 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:21:38.293 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:21:38.293 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local target=none 00:21:38.293 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:21:38.293 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.293 04:22:26 
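The explicit bdev_raid_add_base_bdev call above is expected to fail: BaseBdev1's uuid is not in raid_bdev1's superblock, so the RPC returns JSON-RPC error -22 ("Failed to add base bdev to RAID bdev: Invalid argument") and the array stays online with two members, which the verify_raid_bdev_state and verify_raid_bdev_process checks above confirm. A minimal negative-test sketch of the same call, under the same socket assumption:

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  if ! rpc bdev_raid_add_base_bdev raid_bdev1 BaseBdev1; then
      echo "add rejected as expected (Invalid argument)"
  fi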
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:21:38.621 "name": "raid_bdev1", 00:21:38.621 "uuid": "f7627ee0-666d-49ad-8be9-79a6ce34d8cf", 00:21:38.621 "strip_size_kb": 0, 00:21:38.621 "state": "online", 00:21:38.621 "raid_level": "raid1", 00:21:38.621 "superblock": true, 00:21:38.621 "num_base_bdevs": 4, 00:21:38.621 "num_base_bdevs_discovered": 2, 00:21:38.621 "num_base_bdevs_operational": 2, 00:21:38.621 "base_bdevs_list": [ 00:21:38.621 { 00:21:38.621 "name": null, 00:21:38.621 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.621 "is_configured": false, 00:21:38.621 "data_offset": 2048, 00:21:38.621 "data_size": 63488 00:21:38.621 }, 00:21:38.621 { 00:21:38.621 "name": null, 00:21:38.621 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.621 "is_configured": false, 00:21:38.621 "data_offset": 2048, 00:21:38.621 "data_size": 63488 00:21:38.621 }, 00:21:38.621 { 00:21:38.621 "name": "BaseBdev3", 00:21:38.621 "uuid": "abdb8432-f958-5456-848a-730ff985c282", 00:21:38.621 "is_configured": true, 00:21:38.621 "data_offset": 2048, 00:21:38.621 "data_size": 63488 00:21:38.621 }, 00:21:38.621 { 00:21:38.621 "name": "BaseBdev4", 00:21:38.621 "uuid": "49ced9b2-7f17-55b8-90ab-7fc48d183f1e", 00:21:38.621 "is_configured": true, 00:21:38.621 "data_offset": 2048, 00:21:38.621 "data_size": 63488 00:21:38.621 } 00:21:38.621 ] 00:21:38.621 }' 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@783 -- # killprocess 3929980 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@946 -- # '[' -z 3929980 ']' 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # kill -0 3929980 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # uname 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3929980 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:21:38.621 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3929980' 00:21:38.622 killing process with pid 3929980 00:21:38.622 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@965 -- # kill 3929980 00:21:38.622 Received shutdown signal, test time was about 27.438511 seconds 00:21:38.622 00:21:38.622 Latency(us) 00:21:38.622 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:38.622 =================================================================================================================== 00:21:38.622 Total : 0.00 0.00 0.00 0.00 
0.00 0.00 0.00 00:21:38.622 [2024-05-15 04:22:26.480041] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:38.622 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@970 -- # wait 3929980 00:21:38.622 [2024-05-15 04:22:26.480182] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:38.622 [2024-05-15 04:22:26.480269] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:38.622 [2024-05-15 04:22:26.480286] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c16740 name raid_bdev1, state offline 00:21:38.622 [2024-05-15 04:22:26.537385] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:38.880 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@785 -- # return 0 00:21:38.880 00:21:38.880 real 0m33.202s 00:21:38.880 user 0m53.177s 00:21:38.880 sys 0m4.153s 00:21:38.880 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:21:38.880 04:22:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:21:38.880 ************************************ 00:21:38.880 END TEST raid_rebuild_test_sb_io 00:21:38.880 ************************************ 00:21:38.880 04:22:26 bdev_raid -- bdev/bdev_raid.sh@818 -- # '[' n == y ']' 00:21:38.880 04:22:26 bdev_raid -- bdev/bdev_raid.sh@830 -- # base_blocklen=4096 00:21:38.880 04:22:26 bdev_raid -- bdev/bdev_raid.sh@832 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:21:38.880 04:22:26 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:21:38.880 04:22:26 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:21:38.880 04:22:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:38.880 ************************************ 00:21:38.880 START TEST raid_state_function_test_sb_4k 00:21:38.880 ************************************ 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:21:38.880 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # raid_pid=3934335 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3934335' 00:21:39.139 Process raid pid: 3934335 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@247 -- # waitforlisten 3934335 /var/tmp/spdk-raid.sock 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@827 -- # '[' -z 3934335 ']' 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@832 -- # local max_retries=100 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:39.139 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # xtrace_disable 00:21:39.139 04:22:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:39.139 [2024-05-15 04:22:26.945347] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
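The trace above shows the harness for raid_state_function_test_sb_4k coming up: a dedicated bdev_svc application is launched with its own RPC socket (-r /var/tmp/spdk-raid.sock) and bdev_raid debug logging (-L bdev_raid), and the test waits in waitforlisten until that socket answers before it issues any bdev_raid_create calls. A condensed sketch of that setup sequence follows, using the same binary path, socket and bdev names as this run; waitforlisten is the autotest_common.sh helper seen in the trace, so outside the autotest environment a simple retry loop on the socket would have to stand in for it.

  # start the minimal SPDK app that hosts the bdev layer for this test
  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  SOCK=/var/tmp/spdk-raid.sock
  "$SPDK/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
  raid_pid=$!
  # block until the app is listening on the UNIX-domain socket before sending RPCs
  waitforlisten "$raid_pid" "$SOCK"
  # first RPC of the test: request the raid1 volume while neither base bdev exists yet,
  # which is why the trace logs "base bdev BaseBdev1 doesn't exist now" right after this
  "$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid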
00:21:39.139 [2024-05-15 04:22:26.945420] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:39.139 [2024-05-15 04:22:27.030442] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:39.397 [2024-05-15 04:22:27.156864] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:21:39.397 [2024-05-15 04:22:27.227534] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:39.397 [2024-05-15 04:22:27.227584] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:39.962 04:22:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:21:39.962 04:22:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@860 -- # return 0 00:21:39.962 04:22:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:40.220 [2024-05-15 04:22:28.201212] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:40.221 [2024-05-15 04:22:28.201264] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:40.221 [2024-05-15 04:22:28.201277] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:40.221 [2024-05-15 04:22:28.201289] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.221 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:40.478 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:40.478 "name": "Existed_Raid", 00:21:40.478 "uuid": "1a918fc7-80a8-43fb-881e-7c46616d2d97", 00:21:40.478 "strip_size_kb": 0, 00:21:40.478 "state": "configuring", 00:21:40.478 
"raid_level": "raid1", 00:21:40.478 "superblock": true, 00:21:40.478 "num_base_bdevs": 2, 00:21:40.478 "num_base_bdevs_discovered": 0, 00:21:40.478 "num_base_bdevs_operational": 2, 00:21:40.478 "base_bdevs_list": [ 00:21:40.478 { 00:21:40.478 "name": "BaseBdev1", 00:21:40.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.478 "is_configured": false, 00:21:40.478 "data_offset": 0, 00:21:40.478 "data_size": 0 00:21:40.478 }, 00:21:40.479 { 00:21:40.479 "name": "BaseBdev2", 00:21:40.479 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.479 "is_configured": false, 00:21:40.479 "data_offset": 0, 00:21:40.479 "data_size": 0 00:21:40.479 } 00:21:40.479 ] 00:21:40.479 }' 00:21:40.479 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:40.479 04:22:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:41.412 04:22:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:41.412 [2024-05-15 04:22:29.340079] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:41.412 [2024-05-15 04:22:29.340117] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x258c000 name Existed_Raid, state configuring 00:21:41.412 04:22:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:41.671 [2024-05-15 04:22:29.576731] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:41.671 [2024-05-15 04:22:29.576772] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:41.671 [2024-05-15 04:22:29.576784] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:41.671 [2024-05-15 04:22:29.576796] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:41.671 04:22:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:21:41.928 [2024-05-15 04:22:29.825133] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:41.928 BaseBdev1 00:21:41.928 04:22:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:21:41.928 04:22:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:21:41.928 04:22:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:21:41.928 04:22:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local i 00:21:41.928 04:22:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:21:41.928 04:22:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:21:41.928 04:22:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:42.185 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:42.442 [ 00:21:42.442 { 00:21:42.442 "name": "BaseBdev1", 00:21:42.442 "aliases": [ 00:21:42.442 "0703044b-5a1e-4404-bfcc-e436dc5d5646" 00:21:42.442 ], 00:21:42.442 "product_name": "Malloc disk", 00:21:42.442 "block_size": 4096, 00:21:42.442 "num_blocks": 8192, 00:21:42.442 "uuid": "0703044b-5a1e-4404-bfcc-e436dc5d5646", 00:21:42.442 "assigned_rate_limits": { 00:21:42.442 "rw_ios_per_sec": 0, 00:21:42.442 "rw_mbytes_per_sec": 0, 00:21:42.442 "r_mbytes_per_sec": 0, 00:21:42.442 "w_mbytes_per_sec": 0 00:21:42.442 }, 00:21:42.442 "claimed": true, 00:21:42.442 "claim_type": "exclusive_write", 00:21:42.442 "zoned": false, 00:21:42.442 "supported_io_types": { 00:21:42.442 "read": true, 00:21:42.442 "write": true, 00:21:42.442 "unmap": true, 00:21:42.442 "write_zeroes": true, 00:21:42.442 "flush": true, 00:21:42.442 "reset": true, 00:21:42.442 "compare": false, 00:21:42.442 "compare_and_write": false, 00:21:42.442 "abort": true, 00:21:42.442 "nvme_admin": false, 00:21:42.442 "nvme_io": false 00:21:42.442 }, 00:21:42.442 "memory_domains": [ 00:21:42.442 { 00:21:42.442 "dma_device_id": "system", 00:21:42.442 "dma_device_type": 1 00:21:42.442 }, 00:21:42.442 { 00:21:42.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:42.442 "dma_device_type": 2 00:21:42.442 } 00:21:42.442 ], 00:21:42.442 "driver_specific": {} 00:21:42.442 } 00:21:42.442 ] 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # return 0 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.443 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:42.700 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:42.700 "name": "Existed_Raid", 00:21:42.700 "uuid": "bcffd397-9984-454e-a39b-2c5f3735cdbd", 00:21:42.700 "strip_size_kb": 0, 00:21:42.700 "state": "configuring", 00:21:42.700 "raid_level": "raid1", 00:21:42.700 "superblock": true, 00:21:42.700 "num_base_bdevs": 2, 00:21:42.700 
"num_base_bdevs_discovered": 1, 00:21:42.700 "num_base_bdevs_operational": 2, 00:21:42.700 "base_bdevs_list": [ 00:21:42.700 { 00:21:42.700 "name": "BaseBdev1", 00:21:42.700 "uuid": "0703044b-5a1e-4404-bfcc-e436dc5d5646", 00:21:42.700 "is_configured": true, 00:21:42.700 "data_offset": 256, 00:21:42.700 "data_size": 7936 00:21:42.700 }, 00:21:42.700 { 00:21:42.700 "name": "BaseBdev2", 00:21:42.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:42.700 "is_configured": false, 00:21:42.700 "data_offset": 0, 00:21:42.700 "data_size": 0 00:21:42.700 } 00:21:42.700 ] 00:21:42.700 }' 00:21:42.700 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:42.700 04:22:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:43.265 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:43.522 [2024-05-15 04:22:31.421333] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:43.522 [2024-05-15 04:22:31.421392] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x258b8f0 name Existed_Raid, state configuring 00:21:43.522 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:21:43.780 [2024-05-15 04:22:31.710148] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:43.780 [2024-05-15 04:22:31.711693] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:43.780 [2024-05-15 04:22:31.711731] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:43.780 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:44.037 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:44.037 "name": "Existed_Raid", 00:21:44.037 "uuid": "85e180a0-fa6c-4bd2-bdee-5a26bf98b0a9", 00:21:44.037 "strip_size_kb": 0, 00:21:44.037 "state": "configuring", 00:21:44.037 "raid_level": "raid1", 00:21:44.037 "superblock": true, 00:21:44.037 "num_base_bdevs": 2, 00:21:44.037 "num_base_bdevs_discovered": 1, 00:21:44.037 "num_base_bdevs_operational": 2, 00:21:44.037 "base_bdevs_list": [ 00:21:44.037 { 00:21:44.037 "name": "BaseBdev1", 00:21:44.037 "uuid": "0703044b-5a1e-4404-bfcc-e436dc5d5646", 00:21:44.037 "is_configured": true, 00:21:44.037 "data_offset": 256, 00:21:44.037 "data_size": 7936 00:21:44.037 }, 00:21:44.037 { 00:21:44.037 "name": "BaseBdev2", 00:21:44.037 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:44.037 "is_configured": false, 00:21:44.038 "data_offset": 0, 00:21:44.038 "data_size": 0 00:21:44.038 } 00:21:44.038 ] 00:21:44.038 }' 00:21:44.038 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:44.038 04:22:31 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:44.602 04:22:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:21:44.859 [2024-05-15 04:22:32.798282] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:44.859 [2024-05-15 04:22:32.798502] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x258c6e0 00:21:44.859 [2024-05-15 04:22:32.798521] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:44.859 [2024-05-15 04:22:32.798699] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x258dbb0 00:21:44.859 [2024-05-15 04:22:32.798862] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x258c6e0 00:21:44.859 [2024-05-15 04:22:32.798879] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x258c6e0 00:21:44.859 [2024-05-15 04:22:32.798990] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:44.859 BaseBdev2 00:21:44.859 04:22:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:21:44.859 04:22:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:21:44.859 04:22:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:21:44.859 04:22:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local i 00:21:44.859 04:22:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:21:44.859 04:22:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:21:44.859 04:22:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:45.116 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:45.375 [ 00:21:45.375 { 00:21:45.375 "name": "BaseBdev2", 00:21:45.375 "aliases": [ 00:21:45.375 "a2442bc7-7dee-4531-8ae0-acba612cbb84" 00:21:45.375 ], 00:21:45.375 "product_name": "Malloc disk", 00:21:45.375 "block_size": 4096, 00:21:45.375 "num_blocks": 8192, 00:21:45.375 "uuid": "a2442bc7-7dee-4531-8ae0-acba612cbb84", 00:21:45.375 "assigned_rate_limits": { 00:21:45.375 "rw_ios_per_sec": 0, 00:21:45.375 "rw_mbytes_per_sec": 0, 00:21:45.375 "r_mbytes_per_sec": 0, 00:21:45.375 "w_mbytes_per_sec": 0 00:21:45.375 }, 00:21:45.375 "claimed": true, 00:21:45.375 "claim_type": "exclusive_write", 00:21:45.375 "zoned": false, 00:21:45.375 "supported_io_types": { 00:21:45.375 "read": true, 00:21:45.375 "write": true, 00:21:45.375 "unmap": true, 00:21:45.375 "write_zeroes": true, 00:21:45.375 "flush": true, 00:21:45.375 "reset": true, 00:21:45.375 "compare": false, 00:21:45.375 "compare_and_write": false, 00:21:45.375 "abort": true, 00:21:45.375 "nvme_admin": false, 00:21:45.375 "nvme_io": false 00:21:45.375 }, 00:21:45.375 "memory_domains": [ 00:21:45.375 { 00:21:45.375 "dma_device_id": "system", 00:21:45.375 "dma_device_type": 1 00:21:45.375 }, 00:21:45.375 { 00:21:45.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.375 "dma_device_type": 2 00:21:45.375 } 00:21:45.375 ], 00:21:45.375 "driver_specific": {} 00:21:45.375 } 00:21:45.375 ] 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # return 0 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.375 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:45.633 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:45.633 "name": "Existed_Raid", 00:21:45.633 "uuid": 
"85e180a0-fa6c-4bd2-bdee-5a26bf98b0a9", 00:21:45.633 "strip_size_kb": 0, 00:21:45.633 "state": "online", 00:21:45.633 "raid_level": "raid1", 00:21:45.633 "superblock": true, 00:21:45.633 "num_base_bdevs": 2, 00:21:45.633 "num_base_bdevs_discovered": 2, 00:21:45.633 "num_base_bdevs_operational": 2, 00:21:45.633 "base_bdevs_list": [ 00:21:45.633 { 00:21:45.633 "name": "BaseBdev1", 00:21:45.633 "uuid": "0703044b-5a1e-4404-bfcc-e436dc5d5646", 00:21:45.633 "is_configured": true, 00:21:45.633 "data_offset": 256, 00:21:45.633 "data_size": 7936 00:21:45.633 }, 00:21:45.633 { 00:21:45.633 "name": "BaseBdev2", 00:21:45.633 "uuid": "a2442bc7-7dee-4531-8ae0-acba612cbb84", 00:21:45.633 "is_configured": true, 00:21:45.633 "data_offset": 256, 00:21:45.633 "data_size": 7936 00:21:45.633 } 00:21:45.633 ] 00:21:45.633 }' 00:21:45.633 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:45.633 04:22:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:46.199 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:21:46.199 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:21:46.199 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:21:46.199 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:21:46.199 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:21:46.199 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@199 -- # local name 00:21:46.199 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:46.199 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:21:46.456 [2024-05-15 04:22:34.358655] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:46.456 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:21:46.456 "name": "Existed_Raid", 00:21:46.456 "aliases": [ 00:21:46.456 "85e180a0-fa6c-4bd2-bdee-5a26bf98b0a9" 00:21:46.456 ], 00:21:46.456 "product_name": "Raid Volume", 00:21:46.456 "block_size": 4096, 00:21:46.456 "num_blocks": 7936, 00:21:46.456 "uuid": "85e180a0-fa6c-4bd2-bdee-5a26bf98b0a9", 00:21:46.456 "assigned_rate_limits": { 00:21:46.456 "rw_ios_per_sec": 0, 00:21:46.456 "rw_mbytes_per_sec": 0, 00:21:46.456 "r_mbytes_per_sec": 0, 00:21:46.456 "w_mbytes_per_sec": 0 00:21:46.456 }, 00:21:46.456 "claimed": false, 00:21:46.456 "zoned": false, 00:21:46.456 "supported_io_types": { 00:21:46.456 "read": true, 00:21:46.456 "write": true, 00:21:46.456 "unmap": false, 00:21:46.456 "write_zeroes": true, 00:21:46.456 "flush": false, 00:21:46.456 "reset": true, 00:21:46.456 "compare": false, 00:21:46.456 "compare_and_write": false, 00:21:46.456 "abort": false, 00:21:46.456 "nvme_admin": false, 00:21:46.456 "nvme_io": false 00:21:46.456 }, 00:21:46.456 "memory_domains": [ 00:21:46.456 { 00:21:46.456 "dma_device_id": "system", 00:21:46.456 "dma_device_type": 1 00:21:46.456 }, 00:21:46.456 { 00:21:46.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.456 "dma_device_type": 2 00:21:46.456 }, 00:21:46.456 { 00:21:46.456 "dma_device_id": "system", 
00:21:46.456 "dma_device_type": 1 00:21:46.456 }, 00:21:46.456 { 00:21:46.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.456 "dma_device_type": 2 00:21:46.456 } 00:21:46.456 ], 00:21:46.456 "driver_specific": { 00:21:46.456 "raid": { 00:21:46.456 "uuid": "85e180a0-fa6c-4bd2-bdee-5a26bf98b0a9", 00:21:46.456 "strip_size_kb": 0, 00:21:46.456 "state": "online", 00:21:46.456 "raid_level": "raid1", 00:21:46.456 "superblock": true, 00:21:46.456 "num_base_bdevs": 2, 00:21:46.456 "num_base_bdevs_discovered": 2, 00:21:46.456 "num_base_bdevs_operational": 2, 00:21:46.456 "base_bdevs_list": [ 00:21:46.456 { 00:21:46.456 "name": "BaseBdev1", 00:21:46.456 "uuid": "0703044b-5a1e-4404-bfcc-e436dc5d5646", 00:21:46.456 "is_configured": true, 00:21:46.456 "data_offset": 256, 00:21:46.456 "data_size": 7936 00:21:46.456 }, 00:21:46.456 { 00:21:46.456 "name": "BaseBdev2", 00:21:46.456 "uuid": "a2442bc7-7dee-4531-8ae0-acba612cbb84", 00:21:46.456 "is_configured": true, 00:21:46.456 "data_offset": 256, 00:21:46.456 "data_size": 7936 00:21:46.456 } 00:21:46.456 ] 00:21:46.456 } 00:21:46.456 } 00:21:46.456 }' 00:21:46.456 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:46.456 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:21:46.456 BaseBdev2' 00:21:46.456 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:21:46.456 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:46.456 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:21:46.714 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:21:46.714 "name": "BaseBdev1", 00:21:46.714 "aliases": [ 00:21:46.714 "0703044b-5a1e-4404-bfcc-e436dc5d5646" 00:21:46.714 ], 00:21:46.714 "product_name": "Malloc disk", 00:21:46.714 "block_size": 4096, 00:21:46.714 "num_blocks": 8192, 00:21:46.714 "uuid": "0703044b-5a1e-4404-bfcc-e436dc5d5646", 00:21:46.714 "assigned_rate_limits": { 00:21:46.714 "rw_ios_per_sec": 0, 00:21:46.714 "rw_mbytes_per_sec": 0, 00:21:46.714 "r_mbytes_per_sec": 0, 00:21:46.714 "w_mbytes_per_sec": 0 00:21:46.714 }, 00:21:46.714 "claimed": true, 00:21:46.714 "claim_type": "exclusive_write", 00:21:46.714 "zoned": false, 00:21:46.714 "supported_io_types": { 00:21:46.714 "read": true, 00:21:46.714 "write": true, 00:21:46.714 "unmap": true, 00:21:46.714 "write_zeroes": true, 00:21:46.714 "flush": true, 00:21:46.714 "reset": true, 00:21:46.714 "compare": false, 00:21:46.714 "compare_and_write": false, 00:21:46.714 "abort": true, 00:21:46.714 "nvme_admin": false, 00:21:46.714 "nvme_io": false 00:21:46.714 }, 00:21:46.714 "memory_domains": [ 00:21:46.714 { 00:21:46.714 "dma_device_id": "system", 00:21:46.714 "dma_device_type": 1 00:21:46.714 }, 00:21:46.714 { 00:21:46.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.714 "dma_device_type": 2 00:21:46.714 } 00:21:46.714 ], 00:21:46.714 "driver_specific": {} 00:21:46.714 }' 00:21:46.714 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:46.714 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:46.972 04:22:34 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:46.972 04:22:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:21:47.229 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:21:47.229 "name": "BaseBdev2", 00:21:47.229 "aliases": [ 00:21:47.230 "a2442bc7-7dee-4531-8ae0-acba612cbb84" 00:21:47.230 ], 00:21:47.230 "product_name": "Malloc disk", 00:21:47.230 "block_size": 4096, 00:21:47.230 "num_blocks": 8192, 00:21:47.230 "uuid": "a2442bc7-7dee-4531-8ae0-acba612cbb84", 00:21:47.230 "assigned_rate_limits": { 00:21:47.230 "rw_ios_per_sec": 0, 00:21:47.230 "rw_mbytes_per_sec": 0, 00:21:47.230 "r_mbytes_per_sec": 0, 00:21:47.230 "w_mbytes_per_sec": 0 00:21:47.230 }, 00:21:47.230 "claimed": true, 00:21:47.230 "claim_type": "exclusive_write", 00:21:47.230 "zoned": false, 00:21:47.230 "supported_io_types": { 00:21:47.230 "read": true, 00:21:47.230 "write": true, 00:21:47.230 "unmap": true, 00:21:47.230 "write_zeroes": true, 00:21:47.230 "flush": true, 00:21:47.230 "reset": true, 00:21:47.230 "compare": false, 00:21:47.230 "compare_and_write": false, 00:21:47.230 "abort": true, 00:21:47.230 "nvme_admin": false, 00:21:47.230 "nvme_io": false 00:21:47.230 }, 00:21:47.230 "memory_domains": [ 00:21:47.230 { 00:21:47.230 "dma_device_id": "system", 00:21:47.230 "dma_device_type": 1 00:21:47.230 }, 00:21:47.230 { 00:21:47.230 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.230 "dma_device_type": 2 00:21:47.230 } 00:21:47.230 ], 00:21:47.230 "driver_specific": {} 00:21:47.230 }' 00:21:47.230 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:47.230 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:21:47.487 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:47.746 [2024-05-15 04:22:35.706083] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # local expected_state 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # case $1 in 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@215 -- # return 0 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.746 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:48.004 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:48.004 "name": "Existed_Raid", 00:21:48.004 "uuid": "85e180a0-fa6c-4bd2-bdee-5a26bf98b0a9", 00:21:48.004 "strip_size_kb": 0, 00:21:48.004 "state": "online", 00:21:48.004 "raid_level": "raid1", 00:21:48.004 "superblock": true, 00:21:48.004 "num_base_bdevs": 2, 00:21:48.004 "num_base_bdevs_discovered": 1, 
00:21:48.004 "num_base_bdevs_operational": 1, 00:21:48.004 "base_bdevs_list": [ 00:21:48.004 { 00:21:48.004 "name": null, 00:21:48.004 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:48.004 "is_configured": false, 00:21:48.004 "data_offset": 256, 00:21:48.004 "data_size": 7936 00:21:48.004 }, 00:21:48.004 { 00:21:48.004 "name": "BaseBdev2", 00:21:48.004 "uuid": "a2442bc7-7dee-4531-8ae0-acba612cbb84", 00:21:48.004 "is_configured": true, 00:21:48.004 "data_offset": 256, 00:21:48.004 "data_size": 7936 00:21:48.004 } 00:21:48.004 ] 00:21:48.004 }' 00:21:48.004 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:48.004 04:22:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:48.570 04:22:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:21:48.570 04:22:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:21:48.570 04:22:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.570 04:22:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:21:48.827 04:22:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:21:48.827 04:22:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:48.827 04:22:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:49.085 [2024-05-15 04:22:37.028815] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:49.085 [2024-05-15 04:22:37.028937] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:49.085 [2024-05-15 04:22:37.042216] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:49.085 [2024-05-15 04:22:37.042278] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:49.085 [2024-05-15 04:22:37.042302] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x258c6e0 name Existed_Raid, state offline 00:21:49.085 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:21:49.085 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:21:49.085 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.085 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:21:49.343 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:21:49.343 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:21:49.343 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:21:49.343 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@342 -- # killprocess 3934335 00:21:49.343 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@946 -- # '[' -z 3934335 ']' 
00:21:49.343 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@950 -- # kill -0 3934335 00:21:49.343 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@951 -- # uname 00:21:49.343 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:21:49.343 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3934335 00:21:49.601 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:21:49.601 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:21:49.601 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3934335' 00:21:49.601 killing process with pid 3934335 00:21:49.601 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@965 -- # kill 3934335 00:21:49.601 [2024-05-15 04:22:37.375920] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:49.601 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@970 -- # wait 3934335 00:21:49.601 [2024-05-15 04:22:37.377070] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:49.859 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@344 -- # return 0 00:21:49.859 00:21:49.859 real 0m10.767s 00:21:49.859 user 0m19.480s 00:21:49.859 sys 0m1.517s 00:21:49.859 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1122 -- # xtrace_disable 00:21:49.859 04:22:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:21:49.859 ************************************ 00:21:49.859 END TEST raid_state_function_test_sb_4k 00:21:49.859 ************************************ 00:21:49.859 04:22:37 bdev_raid -- bdev/bdev_raid.sh@833 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:21:49.859 04:22:37 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:21:49.859 04:22:37 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:21:49.859 04:22:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:49.859 ************************************ 00:21:49.859 START TEST raid_superblock_test_4k 00:21:49.859 ************************************ 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local 
raid_bdev_name=raid_bdev1 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size 00:21:49.859 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # raid_pid=3935864 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@413 -- # waitforlisten 3935864 /var/tmp/spdk-raid.sock 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@827 -- # '[' -z 3935864 ']' 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@832 -- # local max_retries=100 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:49.860 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # xtrace_disable 00:21:49.860 04:22:37 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:49.860 [2024-05-15 04:22:37.764998] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
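The superblock variant that starts here builds each raid leg as a malloc bdev wrapped in a passthru bdev carrying a fixed, well-known UUID, presumably so the test knows every base bdev's UUID up front when it later inspects the superblock. The per-leg construction performed in the following trace, condensed into one snippet with the same sizes, names and UUIDs as this run (32 MB malloc bdevs with a 4096-byte block size, as in the earlier state-function test):

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # 32 MB backing bdev with 4 KiB blocks
  $RPC bdev_malloc_create 32 4096 -b malloc1
  # expose it through a passthru bdev with a fixed UUID
  $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  # second leg is built the same way
  $RPC bdev_malloc_create 32 4096 -b malloc2
  $RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  # raid1 across the two passthru bdevs, with an on-disk superblock (-s)
  $RPC bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s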
00:21:49.860 [2024-05-15 04:22:37.765072] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3935864 ] 00:21:49.860 [2024-05-15 04:22:37.842124] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:50.118 [2024-05-15 04:22:37.952984] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:21:50.118 [2024-05-15 04:22:38.026159] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:50.118 [2024-05-15 04:22:38.026205] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@860 -- # return 0 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:21:51.052 malloc1 00:21:51.052 04:22:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:51.310 [2024-05-15 04:22:39.242975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:51.310 [2024-05-15 04:22:39.243032] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:51.310 [2024-05-15 04:22:39.243061] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1466c20 00:21:51.310 [2024-05-15 04:22:39.243075] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:51.310 [2024-05-15 04:22:39.244913] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:51.310 [2024-05-15 04:22:39.244943] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:51.310 pt1 00:21:51.310 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:21:51.310 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:21:51.310 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:21:51.310 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:21:51.310 04:22:39 bdev_raid.raid_superblock_test_4k 
-- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:51.310 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:51.310 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:21:51.310 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:51.310 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:21:51.568 malloc2 00:21:51.568 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:51.827 [2024-05-15 04:22:39.752094] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:51.827 [2024-05-15 04:22:39.752168] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:51.827 [2024-05-15 04:22:39.752194] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x145ec00 00:21:51.827 [2024-05-15 04:22:39.752207] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:51.827 [2024-05-15 04:22:39.753971] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:51.827 [2024-05-15 04:22:39.753996] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:51.827 pt2 00:21:51.827 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:21:51.827 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:21:51.827 04:22:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:21:52.086 [2024-05-15 04:22:39.996778] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:52.086 [2024-05-15 04:22:39.998036] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:52.086 [2024-05-15 04:22:39.998239] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x145f230 00:21:52.086 [2024-05-15 04:22:39.998253] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:52.086 [2024-05-15 04:22:39.998447] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x147db10 00:21:52.086 [2024-05-15 04:22:39.998600] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x145f230 00:21:52.086 [2024-05-15 04:22:39.998613] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x145f230 00:21:52.086 [2024-05-15 04:22:39.998733] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # 
local raid_level=raid1 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.086 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:52.344 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:52.344 "name": "raid_bdev1", 00:21:52.344 "uuid": "2db9bddf-48f7-44c5-84cc-5543e6f48963", 00:21:52.344 "strip_size_kb": 0, 00:21:52.344 "state": "online", 00:21:52.344 "raid_level": "raid1", 00:21:52.344 "superblock": true, 00:21:52.344 "num_base_bdevs": 2, 00:21:52.344 "num_base_bdevs_discovered": 2, 00:21:52.344 "num_base_bdevs_operational": 2, 00:21:52.344 "base_bdevs_list": [ 00:21:52.344 { 00:21:52.344 "name": "pt1", 00:21:52.344 "uuid": "2b33c4af-8cee-5342-90fe-0d42e9a9c424", 00:21:52.344 "is_configured": true, 00:21:52.344 "data_offset": 256, 00:21:52.344 "data_size": 7936 00:21:52.344 }, 00:21:52.344 { 00:21:52.344 "name": "pt2", 00:21:52.344 "uuid": "c4118c5e-acae-5169-8de0-dba3dedcd039", 00:21:52.344 "is_configured": true, 00:21:52.344 "data_offset": 256, 00:21:52.344 "data_size": 7936 00:21:52.344 } 00:21:52.344 ] 00:21:52.344 }' 00:21:52.344 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:52.344 04:22:40 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:52.910 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:21:52.910 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:21:52.910 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:21:52.910 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:21:52.910 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:21:52.910 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@199 -- # local name 00:21:52.910 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:52.910 04:22:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:21:53.168 [2024-05-15 04:22:41.075807] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:53.168 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:21:53.168 "name": "raid_bdev1", 00:21:53.168 "aliases": [ 00:21:53.168 "2db9bddf-48f7-44c5-84cc-5543e6f48963" 00:21:53.168 ], 00:21:53.168 "product_name": "Raid Volume", 00:21:53.168 
"block_size": 4096, 00:21:53.168 "num_blocks": 7936, 00:21:53.168 "uuid": "2db9bddf-48f7-44c5-84cc-5543e6f48963", 00:21:53.168 "assigned_rate_limits": { 00:21:53.168 "rw_ios_per_sec": 0, 00:21:53.168 "rw_mbytes_per_sec": 0, 00:21:53.168 "r_mbytes_per_sec": 0, 00:21:53.168 "w_mbytes_per_sec": 0 00:21:53.168 }, 00:21:53.168 "claimed": false, 00:21:53.168 "zoned": false, 00:21:53.168 "supported_io_types": { 00:21:53.168 "read": true, 00:21:53.168 "write": true, 00:21:53.168 "unmap": false, 00:21:53.168 "write_zeroes": true, 00:21:53.168 "flush": false, 00:21:53.168 "reset": true, 00:21:53.168 "compare": false, 00:21:53.168 "compare_and_write": false, 00:21:53.168 "abort": false, 00:21:53.168 "nvme_admin": false, 00:21:53.168 "nvme_io": false 00:21:53.168 }, 00:21:53.168 "memory_domains": [ 00:21:53.168 { 00:21:53.168 "dma_device_id": "system", 00:21:53.168 "dma_device_type": 1 00:21:53.168 }, 00:21:53.168 { 00:21:53.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.168 "dma_device_type": 2 00:21:53.168 }, 00:21:53.168 { 00:21:53.168 "dma_device_id": "system", 00:21:53.168 "dma_device_type": 1 00:21:53.168 }, 00:21:53.168 { 00:21:53.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.168 "dma_device_type": 2 00:21:53.168 } 00:21:53.168 ], 00:21:53.168 "driver_specific": { 00:21:53.168 "raid": { 00:21:53.168 "uuid": "2db9bddf-48f7-44c5-84cc-5543e6f48963", 00:21:53.168 "strip_size_kb": 0, 00:21:53.168 "state": "online", 00:21:53.168 "raid_level": "raid1", 00:21:53.168 "superblock": true, 00:21:53.168 "num_base_bdevs": 2, 00:21:53.168 "num_base_bdevs_discovered": 2, 00:21:53.168 "num_base_bdevs_operational": 2, 00:21:53.168 "base_bdevs_list": [ 00:21:53.168 { 00:21:53.168 "name": "pt1", 00:21:53.168 "uuid": "2b33c4af-8cee-5342-90fe-0d42e9a9c424", 00:21:53.168 "is_configured": true, 00:21:53.168 "data_offset": 256, 00:21:53.168 "data_size": 7936 00:21:53.168 }, 00:21:53.168 { 00:21:53.168 "name": "pt2", 00:21:53.168 "uuid": "c4118c5e-acae-5169-8de0-dba3dedcd039", 00:21:53.168 "is_configured": true, 00:21:53.169 "data_offset": 256, 00:21:53.169 "data_size": 7936 00:21:53.169 } 00:21:53.169 ] 00:21:53.169 } 00:21:53.169 } 00:21:53.169 }' 00:21:53.169 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:53.169 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:21:53.169 pt2' 00:21:53.169 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:21:53.169 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:53.169 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:21:53.426 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:21:53.426 "name": "pt1", 00:21:53.426 "aliases": [ 00:21:53.426 "2b33c4af-8cee-5342-90fe-0d42e9a9c424" 00:21:53.426 ], 00:21:53.426 "product_name": "passthru", 00:21:53.426 "block_size": 4096, 00:21:53.426 "num_blocks": 8192, 00:21:53.426 "uuid": "2b33c4af-8cee-5342-90fe-0d42e9a9c424", 00:21:53.426 "assigned_rate_limits": { 00:21:53.426 "rw_ios_per_sec": 0, 00:21:53.426 "rw_mbytes_per_sec": 0, 00:21:53.426 "r_mbytes_per_sec": 0, 00:21:53.426 "w_mbytes_per_sec": 0 00:21:53.426 }, 00:21:53.426 "claimed": true, 00:21:53.426 "claim_type": "exclusive_write", 
00:21:53.426 "zoned": false, 00:21:53.426 "supported_io_types": { 00:21:53.426 "read": true, 00:21:53.426 "write": true, 00:21:53.426 "unmap": true, 00:21:53.426 "write_zeroes": true, 00:21:53.426 "flush": true, 00:21:53.426 "reset": true, 00:21:53.426 "compare": false, 00:21:53.426 "compare_and_write": false, 00:21:53.426 "abort": true, 00:21:53.426 "nvme_admin": false, 00:21:53.426 "nvme_io": false 00:21:53.426 }, 00:21:53.426 "memory_domains": [ 00:21:53.426 { 00:21:53.426 "dma_device_id": "system", 00:21:53.426 "dma_device_type": 1 00:21:53.426 }, 00:21:53.426 { 00:21:53.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.426 "dma_device_type": 2 00:21:53.426 } 00:21:53.426 ], 00:21:53.426 "driver_specific": { 00:21:53.426 "passthru": { 00:21:53.426 "name": "pt1", 00:21:53.426 "base_bdev_name": "malloc1" 00:21:53.426 } 00:21:53.426 } 00:21:53.426 }' 00:21:53.426 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:53.684 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:53.684 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:21:53.684 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:53.684 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:53.684 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:53.684 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:53.684 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:53.684 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:53.684 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:53.684 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:53.942 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:21:53.942 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:21:53.942 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:53.942 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:21:53.942 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:21:53.942 "name": "pt2", 00:21:53.942 "aliases": [ 00:21:53.942 "c4118c5e-acae-5169-8de0-dba3dedcd039" 00:21:53.942 ], 00:21:53.942 "product_name": "passthru", 00:21:53.942 "block_size": 4096, 00:21:53.942 "num_blocks": 8192, 00:21:53.942 "uuid": "c4118c5e-acae-5169-8de0-dba3dedcd039", 00:21:53.942 "assigned_rate_limits": { 00:21:53.942 "rw_ios_per_sec": 0, 00:21:53.942 "rw_mbytes_per_sec": 0, 00:21:53.942 "r_mbytes_per_sec": 0, 00:21:53.942 "w_mbytes_per_sec": 0 00:21:53.942 }, 00:21:53.942 "claimed": true, 00:21:53.942 "claim_type": "exclusive_write", 00:21:53.942 "zoned": false, 00:21:53.942 "supported_io_types": { 00:21:53.942 "read": true, 00:21:53.942 "write": true, 00:21:53.942 "unmap": true, 00:21:53.942 "write_zeroes": true, 00:21:53.942 "flush": true, 00:21:53.942 "reset": true, 00:21:53.942 "compare": false, 00:21:53.942 "compare_and_write": false, 00:21:53.942 "abort": true, 00:21:53.942 "nvme_admin": false, 00:21:53.942 
"nvme_io": false 00:21:53.942 }, 00:21:53.942 "memory_domains": [ 00:21:53.942 { 00:21:53.942 "dma_device_id": "system", 00:21:53.942 "dma_device_type": 1 00:21:53.942 }, 00:21:53.942 { 00:21:53.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.942 "dma_device_type": 2 00:21:53.942 } 00:21:53.942 ], 00:21:53.942 "driver_specific": { 00:21:53.942 "passthru": { 00:21:53.942 "name": "pt2", 00:21:53.942 "base_bdev_name": "malloc2" 00:21:53.942 } 00:21:53.942 } 00:21:53.942 }' 00:21:53.942 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:54.200 04:22:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:54.200 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:21:54.200 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:54.200 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:54.200 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:54.200 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:54.200 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:54.200 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:54.200 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:54.457 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:54.457 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:21:54.457 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:54.457 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:21:54.715 [2024-05-15 04:22:42.475549] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:54.715 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=2db9bddf-48f7-44c5-84cc-5543e6f48963 00:21:54.715 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 -- # '[' -z 2db9bddf-48f7-44c5-84cc-5543e6f48963 ']' 00:21:54.715 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:54.972 [2024-05-15 04:22:42.735992] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:54.972 [2024-05-15 04:22:42.736018] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:54.972 [2024-05-15 04:22:42.736108] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:54.972 [2024-05-15 04:22:42.736210] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:54.972 [2024-05-15 04:22:42.736224] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x145f230 name raid_bdev1, state offline 00:21:54.972 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.972 04:22:42 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:21:55.230 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:21:55.230 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:21:55.230 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:21:55.230 04:22:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:55.230 04:22:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:21:55.230 04:22:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:55.487 04:22:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:55.487 04:22:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:55.745 04:22:43 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:21:56.003 [2024-05-15 04:22:44.011440] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:56.003 [2024-05-15 04:22:44.012670] 
bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:56.003 [2024-05-15 04:22:44.012729] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:56.003 [2024-05-15 04:22:44.012778] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:56.003 [2024-05-15 04:22:44.012813] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:56.003 [2024-05-15 04:22:44.012833] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1467ef0 name raid_bdev1, state configuring 00:21:56.003 request: 00:21:56.003 { 00:21:56.003 "name": "raid_bdev1", 00:21:56.003 "raid_level": "raid1", 00:21:56.003 "base_bdevs": [ 00:21:56.003 "malloc1", 00:21:56.003 "malloc2" 00:21:56.003 ], 00:21:56.003 "superblock": false, 00:21:56.003 "method": "bdev_raid_create", 00:21:56.003 "req_id": 1 00:21:56.003 } 00:21:56.003 Got JSON-RPC error response 00:21:56.003 response: 00:21:56.003 { 00:21:56.003 "code": -17, 00:21:56.003 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:56.003 } 00:21:56.260 04:22:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:21:56.260 04:22:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:56.260 04:22:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:56.260 04:22:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:56.260 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.260 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:21:56.517 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:21:56.517 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:21:56.517 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:56.517 [2024-05-15 04:22:44.512728] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:56.517 [2024-05-15 04:22:44.512792] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:56.517 [2024-05-15 04:22:44.512820] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x145f920 00:21:56.517 [2024-05-15 04:22:44.512846] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:56.517 [2024-05-15 04:22:44.514638] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:56.517 [2024-05-15 04:22:44.514667] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:56.517 [2024-05-15 04:22:44.514755] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:56.517 [2024-05-15 04:22:44.514797] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:56.517 pt1 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:56.774 "name": "raid_bdev1", 00:21:56.774 "uuid": "2db9bddf-48f7-44c5-84cc-5543e6f48963", 00:21:56.774 "strip_size_kb": 0, 00:21:56.774 "state": "configuring", 00:21:56.774 "raid_level": "raid1", 00:21:56.774 "superblock": true, 00:21:56.774 "num_base_bdevs": 2, 00:21:56.774 "num_base_bdevs_discovered": 1, 00:21:56.774 "num_base_bdevs_operational": 2, 00:21:56.774 "base_bdevs_list": [ 00:21:56.774 { 00:21:56.774 "name": "pt1", 00:21:56.774 "uuid": "2b33c4af-8cee-5342-90fe-0d42e9a9c424", 00:21:56.774 "is_configured": true, 00:21:56.774 "data_offset": 256, 00:21:56.774 "data_size": 7936 00:21:56.774 }, 00:21:56.774 { 00:21:56.774 "name": null, 00:21:56.774 "uuid": "c4118c5e-acae-5169-8de0-dba3dedcd039", 00:21:56.774 "is_configured": false, 00:21:56.774 "data_offset": 256, 00:21:56.774 "data_size": 7936 00:21:56.774 } 00:21:56.774 ] 00:21:56.774 }' 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:56.774 04:22:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:57.705 [2024-05-15 04:22:45.583587] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:57.705 [2024-05-15 04:22:45.583660] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:57.705 [2024-05-15 04:22:45.583687] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x145ee30 00:21:57.705 [2024-05-15 04:22:45.583703] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:57.705 [2024-05-15 04:22:45.584118] vbdev_passthru.c: 
704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:57.705 [2024-05-15 04:22:45.584145] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:57.705 [2024-05-15 04:22:45.584232] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:57.705 [2024-05-15 04:22:45.584262] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:57.705 [2024-05-15 04:22:45.584388] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x14646c0 00:21:57.705 [2024-05-15 04:22:45.584404] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:21:57.705 [2024-05-15 04:22:45.584575] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x147db10 00:21:57.705 [2024-05-15 04:22:45.584730] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14646c0 00:21:57.705 [2024-05-15 04:22:45.584746] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14646c0 00:21:57.705 [2024-05-15 04:22:45.584874] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:57.705 pt2 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.705 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:57.963 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:21:57.963 "name": "raid_bdev1", 00:21:57.963 "uuid": "2db9bddf-48f7-44c5-84cc-5543e6f48963", 00:21:57.963 "strip_size_kb": 0, 00:21:57.963 "state": "online", 00:21:57.963 "raid_level": "raid1", 00:21:57.963 "superblock": true, 00:21:57.963 "num_base_bdevs": 2, 00:21:57.963 "num_base_bdevs_discovered": 2, 00:21:57.963 "num_base_bdevs_operational": 2, 00:21:57.963 "base_bdevs_list": [ 00:21:57.963 { 00:21:57.963 "name": "pt1", 00:21:57.963 "uuid": "2b33c4af-8cee-5342-90fe-0d42e9a9c424", 00:21:57.963 "is_configured": true, 00:21:57.963 "data_offset": 256, 00:21:57.963 
"data_size": 7936 00:21:57.963 }, 00:21:57.964 { 00:21:57.964 "name": "pt2", 00:21:57.964 "uuid": "c4118c5e-acae-5169-8de0-dba3dedcd039", 00:21:57.964 "is_configured": true, 00:21:57.964 "data_offset": 256, 00:21:57.964 "data_size": 7936 00:21:57.964 } 00:21:57.964 ] 00:21:57.964 }' 00:21:57.964 04:22:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:21:57.964 04:22:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:21:58.563 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:21:58.563 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:21:58.563 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:21:58.563 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:21:58.563 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:21:58.563 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@199 -- # local name 00:21:58.563 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:58.563 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:21:58.861 [2024-05-15 04:22:46.654638] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:58.861 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:21:58.861 "name": "raid_bdev1", 00:21:58.861 "aliases": [ 00:21:58.861 "2db9bddf-48f7-44c5-84cc-5543e6f48963" 00:21:58.861 ], 00:21:58.861 "product_name": "Raid Volume", 00:21:58.861 "block_size": 4096, 00:21:58.861 "num_blocks": 7936, 00:21:58.861 "uuid": "2db9bddf-48f7-44c5-84cc-5543e6f48963", 00:21:58.861 "assigned_rate_limits": { 00:21:58.861 "rw_ios_per_sec": 0, 00:21:58.861 "rw_mbytes_per_sec": 0, 00:21:58.861 "r_mbytes_per_sec": 0, 00:21:58.861 "w_mbytes_per_sec": 0 00:21:58.861 }, 00:21:58.861 "claimed": false, 00:21:58.861 "zoned": false, 00:21:58.861 "supported_io_types": { 00:21:58.861 "read": true, 00:21:58.861 "write": true, 00:21:58.861 "unmap": false, 00:21:58.861 "write_zeroes": true, 00:21:58.861 "flush": false, 00:21:58.861 "reset": true, 00:21:58.861 "compare": false, 00:21:58.861 "compare_and_write": false, 00:21:58.861 "abort": false, 00:21:58.861 "nvme_admin": false, 00:21:58.861 "nvme_io": false 00:21:58.861 }, 00:21:58.861 "memory_domains": [ 00:21:58.861 { 00:21:58.861 "dma_device_id": "system", 00:21:58.861 "dma_device_type": 1 00:21:58.861 }, 00:21:58.861 { 00:21:58.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.861 "dma_device_type": 2 00:21:58.861 }, 00:21:58.861 { 00:21:58.861 "dma_device_id": "system", 00:21:58.861 "dma_device_type": 1 00:21:58.861 }, 00:21:58.861 { 00:21:58.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.861 "dma_device_type": 2 00:21:58.861 } 00:21:58.861 ], 00:21:58.861 "driver_specific": { 00:21:58.861 "raid": { 00:21:58.861 "uuid": "2db9bddf-48f7-44c5-84cc-5543e6f48963", 00:21:58.861 "strip_size_kb": 0, 00:21:58.861 "state": "online", 00:21:58.861 "raid_level": "raid1", 00:21:58.861 "superblock": true, 00:21:58.861 "num_base_bdevs": 2, 00:21:58.861 "num_base_bdevs_discovered": 2, 00:21:58.861 "num_base_bdevs_operational": 2, 00:21:58.861 "base_bdevs_list": [ 00:21:58.861 { 
00:21:58.861 "name": "pt1", 00:21:58.861 "uuid": "2b33c4af-8cee-5342-90fe-0d42e9a9c424", 00:21:58.861 "is_configured": true, 00:21:58.861 "data_offset": 256, 00:21:58.861 "data_size": 7936 00:21:58.861 }, 00:21:58.861 { 00:21:58.861 "name": "pt2", 00:21:58.861 "uuid": "c4118c5e-acae-5169-8de0-dba3dedcd039", 00:21:58.861 "is_configured": true, 00:21:58.861 "data_offset": 256, 00:21:58.861 "data_size": 7936 00:21:58.861 } 00:21:58.861 ] 00:21:58.861 } 00:21:58.861 } 00:21:58.861 }' 00:21:58.861 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:58.861 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:21:58.861 pt2' 00:21:58.861 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:21:58.861 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:58.861 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:21:59.119 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:21:59.119 "name": "pt1", 00:21:59.119 "aliases": [ 00:21:59.119 "2b33c4af-8cee-5342-90fe-0d42e9a9c424" 00:21:59.119 ], 00:21:59.119 "product_name": "passthru", 00:21:59.119 "block_size": 4096, 00:21:59.119 "num_blocks": 8192, 00:21:59.119 "uuid": "2b33c4af-8cee-5342-90fe-0d42e9a9c424", 00:21:59.119 "assigned_rate_limits": { 00:21:59.119 "rw_ios_per_sec": 0, 00:21:59.119 "rw_mbytes_per_sec": 0, 00:21:59.119 "r_mbytes_per_sec": 0, 00:21:59.119 "w_mbytes_per_sec": 0 00:21:59.119 }, 00:21:59.119 "claimed": true, 00:21:59.119 "claim_type": "exclusive_write", 00:21:59.119 "zoned": false, 00:21:59.119 "supported_io_types": { 00:21:59.119 "read": true, 00:21:59.119 "write": true, 00:21:59.119 "unmap": true, 00:21:59.119 "write_zeroes": true, 00:21:59.119 "flush": true, 00:21:59.119 "reset": true, 00:21:59.119 "compare": false, 00:21:59.119 "compare_and_write": false, 00:21:59.119 "abort": true, 00:21:59.119 "nvme_admin": false, 00:21:59.119 "nvme_io": false 00:21:59.119 }, 00:21:59.119 "memory_domains": [ 00:21:59.119 { 00:21:59.119 "dma_device_id": "system", 00:21:59.119 "dma_device_type": 1 00:21:59.119 }, 00:21:59.119 { 00:21:59.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:59.119 "dma_device_type": 2 00:21:59.119 } 00:21:59.119 ], 00:21:59.119 "driver_specific": { 00:21:59.119 "passthru": { 00:21:59.119 "name": "pt1", 00:21:59.119 "base_bdev_name": "malloc1" 00:21:59.119 } 00:21:59.119 } 00:21:59.119 }' 00:21:59.119 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:59.119 04:22:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:59.119 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:21:59.119 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:59.119 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:59.119 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:59.119 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:59.378 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:59.378 
04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:59.378 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:59.378 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:59.378 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == null ]] 00:21:59.378 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:21:59.378 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:59.378 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:21:59.636 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:21:59.636 "name": "pt2", 00:21:59.636 "aliases": [ 00:21:59.636 "c4118c5e-acae-5169-8de0-dba3dedcd039" 00:21:59.636 ], 00:21:59.636 "product_name": "passthru", 00:21:59.636 "block_size": 4096, 00:21:59.636 "num_blocks": 8192, 00:21:59.636 "uuid": "c4118c5e-acae-5169-8de0-dba3dedcd039", 00:21:59.636 "assigned_rate_limits": { 00:21:59.636 "rw_ios_per_sec": 0, 00:21:59.636 "rw_mbytes_per_sec": 0, 00:21:59.636 "r_mbytes_per_sec": 0, 00:21:59.636 "w_mbytes_per_sec": 0 00:21:59.636 }, 00:21:59.636 "claimed": true, 00:21:59.636 "claim_type": "exclusive_write", 00:21:59.636 "zoned": false, 00:21:59.636 "supported_io_types": { 00:21:59.636 "read": true, 00:21:59.636 "write": true, 00:21:59.636 "unmap": true, 00:21:59.636 "write_zeroes": true, 00:21:59.636 "flush": true, 00:21:59.636 "reset": true, 00:21:59.636 "compare": false, 00:21:59.636 "compare_and_write": false, 00:21:59.636 "abort": true, 00:21:59.636 "nvme_admin": false, 00:21:59.636 "nvme_io": false 00:21:59.636 }, 00:21:59.636 "memory_domains": [ 00:21:59.636 { 00:21:59.636 "dma_device_id": "system", 00:21:59.636 "dma_device_type": 1 00:21:59.636 }, 00:21:59.636 { 00:21:59.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:59.636 "dma_device_type": 2 00:21:59.636 } 00:21:59.636 ], 00:21:59.636 "driver_specific": { 00:21:59.636 "passthru": { 00:21:59.636 "name": "pt2", 00:21:59.636 "base_bdev_name": "malloc2" 00:21:59.636 } 00:21:59.636 } 00:21:59.636 }' 00:21:59.636 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:59.636 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:21:59.636 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:21:59.636 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:59.636 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:21:59.636 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:59.636 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:59.894 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:21:59.894 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:59.894 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:59.894 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:21:59.894 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@209 -- # [[ null == 
null ]] 00:21:59.894 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:59.894 04:22:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:22:00.152 [2024-05-15 04:22:48.010278] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:00.152 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@487 -- # '[' 2db9bddf-48f7-44c5-84cc-5543e6f48963 '!=' 2db9bddf-48f7-44c5-84cc-5543e6f48963 ']' 00:22:00.152 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:22:00.152 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # case $1 in 00:22:00.152 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@215 -- # return 0 00:22:00.152 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:00.410 [2024-05-15 04:22:48.306892] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.410 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.669 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:00.669 "name": "raid_bdev1", 00:22:00.669 "uuid": "2db9bddf-48f7-44c5-84cc-5543e6f48963", 00:22:00.669 "strip_size_kb": 0, 00:22:00.669 "state": "online", 00:22:00.669 "raid_level": "raid1", 00:22:00.669 "superblock": true, 00:22:00.669 "num_base_bdevs": 2, 00:22:00.669 "num_base_bdevs_discovered": 1, 00:22:00.669 "num_base_bdevs_operational": 1, 00:22:00.669 "base_bdevs_list": [ 00:22:00.669 { 00:22:00.669 "name": null, 00:22:00.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.669 "is_configured": false, 00:22:00.669 "data_offset": 256, 00:22:00.669 "data_size": 7936 00:22:00.669 }, 00:22:00.669 { 00:22:00.669 "name": "pt2", 00:22:00.669 "uuid": "c4118c5e-acae-5169-8de0-dba3dedcd039", 00:22:00.669 "is_configured": true, 00:22:00.669 
"data_offset": 256, 00:22:00.669 "data_size": 7936 00:22:00.669 } 00:22:00.669 ] 00:22:00.669 }' 00:22:00.669 04:22:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:00.669 04:22:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:22:01.235 04:22:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:01.492 [2024-05-15 04:22:49.401743] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:01.492 [2024-05-15 04:22:49.401774] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:01.492 [2024-05-15 04:22:49.401862] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:01.492 [2024-05-15 04:22:49.401918] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:01.492 [2024-05-15 04:22:49.401932] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14646c0 name raid_bdev1, state offline 00:22:01.492 04:22:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.492 04:22:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:22:01.750 04:22:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:22:01.750 04:22:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:22:01.750 04:22:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:22:01.750 04:22:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:22:01.750 04:22:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:02.008 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:22:02.008 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:22:02.008 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:22:02.008 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:22:02.008 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # i=1 00:22:02.008 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:02.266 [2024-05-15 04:22:50.251958] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:02.266 [2024-05-15 04:22:50.252035] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.266 [2024-05-15 04:22:50.252062] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1460650 00:22:02.266 [2024-05-15 04:22:50.252078] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.266 [2024-05-15 04:22:50.253819] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.266 [2024-05-15 04:22:50.253858] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev 
for: pt2 00:22:02.266 [2024-05-15 04:22:50.253945] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:02.266 [2024-05-15 04:22:50.253987] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:02.266 [2024-05-15 04:22:50.254106] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1463d80 00:22:02.266 [2024-05-15 04:22:50.254123] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:02.266 [2024-05-15 04:22:50.254295] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1464cf0 00:22:02.266 [2024-05-15 04:22:50.254455] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1463d80 00:22:02.266 [2024-05-15 04:22:50.254471] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1463d80 00:22:02.266 [2024-05-15 04:22:50.254583] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:02.266 pt2 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:02.266 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:02.524 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:02.524 "name": "raid_bdev1", 00:22:02.524 "uuid": "2db9bddf-48f7-44c5-84cc-5543e6f48963", 00:22:02.524 "strip_size_kb": 0, 00:22:02.524 "state": "online", 00:22:02.524 "raid_level": "raid1", 00:22:02.524 "superblock": true, 00:22:02.524 "num_base_bdevs": 2, 00:22:02.524 "num_base_bdevs_discovered": 1, 00:22:02.524 "num_base_bdevs_operational": 1, 00:22:02.524 "base_bdevs_list": [ 00:22:02.524 { 00:22:02.524 "name": null, 00:22:02.524 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:02.524 "is_configured": false, 00:22:02.524 "data_offset": 256, 00:22:02.524 "data_size": 7936 00:22:02.524 }, 00:22:02.524 { 00:22:02.524 "name": "pt2", 00:22:02.524 "uuid": "c4118c5e-acae-5169-8de0-dba3dedcd039", 00:22:02.524 "is_configured": true, 00:22:02.524 "data_offset": 256, 00:22:02.524 "data_size": 7936 00:22:02.524 } 00:22:02.524 ] 00:22:02.524 }' 00:22:02.524 04:22:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 
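Every verify_raid_bdev_state step in this trace comes down to the same two commands: dump the raid bdev list over RPC and pick out raid_bdev1 with jq. The check can be repeated by hand; a sketch, reusing the socket path from this run, with the expected values being the ones this particular step asserts (online but degraded, after the volume was re-assembled from the superblock found on pt2 alone):

    RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # pull the raid_bdev1 entry out of the bdev_raid_get_bdevs output
    info=$($RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')

    # the volume must be online but degraded: one of its two base bdevs discovered and operational
    [ "$(echo "$info" | jq -r .state)" = online ]
    [ "$(echo "$info" | jq -r .raid_level)" = raid1 ]
    [ "$(echo "$info" | jq -r .num_base_bdevs_discovered)" = 1 ]
    [ "$(echo "$info" | jq -r .num_base_bdevs_operational)" = 1 ]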
00:22:02.524 04:22:50 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:22:03.091 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:03.348 [2024-05-15 04:22:51.330794] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:03.348 [2024-05-15 04:22:51.330844] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:03.348 [2024-05-15 04:22:51.330934] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:03.348 [2024-05-15 04:22:51.330991] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:03.348 [2024-05-15 04:22:51.331008] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1463d80 name raid_bdev1, state offline 00:22:03.348 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.348 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # jq -r '.[]' 00:22:03.914 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # raid_bdev= 00:22:03.914 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@528 -- # '[' -n '' ']' 00:22:03.914 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@532 -- # '[' 2 -gt 2 ']' 00:22:03.914 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:03.914 [2024-05-15 04:22:51.896272] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:03.914 [2024-05-15 04:22:51.896335] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:03.914 [2024-05-15 04:22:51.896364] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1466180 00:22:03.914 [2024-05-15 04:22:51.896380] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:03.914 [2024-05-15 04:22:51.898176] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:03.914 [2024-05-15 04:22:51.898205] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:03.914 [2024-05-15 04:22:51.898304] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:03.914 [2024-05-15 04:22:51.898350] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:03.914 [2024-05-15 04:22:51.898482] bdev_raid.c:3487:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:03.914 [2024-05-15 04:22:51.898502] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:03.914 [2024-05-15 04:22:51.898521] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1464fa0 name raid_bdev1, state configuring 00:22:03.914 [2024-05-15 04:22:51.898551] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:03.914 [2024-05-15 04:22:51.898645] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1465c30 00:22:03.914 [2024-05-15 04:22:51.898662] 
bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:03.914 [2024-05-15 04:22:51.898842] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1464690 00:22:03.914 [2024-05-15 04:22:51.898998] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1465c30 00:22:03.914 [2024-05-15 04:22:51.899014] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1465c30 00:22:03.914 [2024-05-15 04:22:51.899122] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:03.914 pt1 00:22:03.914 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # '[' 2 -gt 2 ']' 00:22:03.914 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:03.914 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:03.914 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:03.914 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:03.915 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:03.915 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:03.915 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:03.915 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:03.915 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:03.915 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:03.915 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.915 04:22:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.481 04:22:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:04.481 "name": "raid_bdev1", 00:22:04.481 "uuid": "2db9bddf-48f7-44c5-84cc-5543e6f48963", 00:22:04.481 "strip_size_kb": 0, 00:22:04.481 "state": "online", 00:22:04.481 "raid_level": "raid1", 00:22:04.481 "superblock": true, 00:22:04.481 "num_base_bdevs": 2, 00:22:04.481 "num_base_bdevs_discovered": 1, 00:22:04.481 "num_base_bdevs_operational": 1, 00:22:04.481 "base_bdevs_list": [ 00:22:04.481 { 00:22:04.481 "name": null, 00:22:04.481 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.481 "is_configured": false, 00:22:04.481 "data_offset": 256, 00:22:04.481 "data_size": 7936 00:22:04.481 }, 00:22:04.481 { 00:22:04.481 "name": "pt2", 00:22:04.481 "uuid": "c4118c5e-acae-5169-8de0-dba3dedcd039", 00:22:04.481 "is_configured": true, 00:22:04.481 "data_offset": 256, 00:22:04.481 "data_size": 7936 00:22:04.481 } 00:22:04.481 ] 00:22:04.481 }' 00:22:04.481 04:22:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:04.481 04:22:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:22:04.739 04:22:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:04.739 
04:22:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@555 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:04.997 04:22:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@555 -- # [[ false == \f\a\l\s\e ]] 00:22:04.997 04:22:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@558 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:04.997 04:22:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@558 -- # jq -r '.[] | .uuid' 00:22:05.255 [2024-05-15 04:22:53.211927] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@558 -- # '[' 2db9bddf-48f7-44c5-84cc-5543e6f48963 '!=' 2db9bddf-48f7-44c5-84cc-5543e6f48963 ']' 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@563 -- # killprocess 3935864 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@946 -- # '[' -z 3935864 ']' 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@950 -- # kill -0 3935864 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@951 -- # uname 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3935864 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3935864' 00:22:05.255 killing process with pid 3935864 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@965 -- # kill 3935864 00:22:05.255 [2024-05-15 04:22:53.260177] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:05.255 [2024-05-15 04:22:53.260269] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:05.255 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@970 -- # wait 3935864 00:22:05.255 [2024-05-15 04:22:53.260337] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:05.255 [2024-05-15 04:22:53.260354] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1465c30 name raid_bdev1, state offline 00:22:05.512 [2024-05-15 04:22:53.282829] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:05.770 04:22:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@565 -- # return 0 00:22:05.770 00:22:05.770 real 0m15.849s 00:22:05.770 user 0m29.213s 00:22:05.770 sys 0m2.187s 00:22:05.770 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1122 -- # xtrace_disable 00:22:05.770 04:22:53 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:22:05.770 ************************************ 00:22:05.770 END TEST raid_superblock_test_4k 00:22:05.770 ************************************ 00:22:05.770 04:22:53 bdev_raid -- bdev/bdev_raid.sh@834 -- # '[' true = true ']' 00:22:05.770 04:22:53 bdev_raid -- bdev/bdev_raid.sh@835 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:22:05.770 04:22:53 
bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:22:05.770 04:22:53 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:22:05.770 04:22:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:05.770 ************************************ 00:22:05.770 START TEST raid_rebuild_test_sb_4k 00:22:05.770 ************************************ 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false true 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=2 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local superblock=true 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local background_io=false 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local verify=true 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local strip_size 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local create_arg 00:22:05.770 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@579 -- # local data_offset 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # '[' true = true ']' 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@593 -- # create_arg+=' -s' 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # raid_pid=3938033 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@598 -- # waitforlisten 3938033 /var/tmp/spdk-raid.sock 
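The xtrace above shows the setup path for raid_rebuild_test_sb_4k: bdevperf is launched idle against a private RPC socket, the harness waits for that socket to come up, and the raid1 bdev is then assembled over rpc.py. The lines below are a condensed sketch of that sequence, reconstructed only from the commands visible in this log, not the test harness itself; SPDK_DIR stands in for the workspace path shown above, and the polling loop is a simplified stand-in for the suite's waitforlisten helper.

#!/usr/bin/env bash
# Condensed sketch of the setup steps traced above (assumes an SPDK build tree).
set -e

SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # placeholder for the workspace path in this log
RPC_SOCK=/var/tmp/spdk-raid.sock
RPC="$SPDK_DIR/scripts/rpc.py -s $RPC_SOCK"

# Start bdevperf idle (-z: wait for RPC configuration) on a dedicated socket,
# with the same workload flags that appear in the trace.
"$SPDK_DIR/build/examples/bdevperf" -r "$RPC_SOCK" -T raid_bdev1 \
    -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
raid_pid=$!    # the harness later tears this process down with killprocess

# Simplified stand-in for waitforlisten(): poll until the RPC socket answers.
until $RPC rpc_get_methods >/dev/null 2>&1; do sleep 0.2; done

# Two 32 MB malloc bdevs with 4096-byte blocks, each wrapped in a passthru bdev,
# exactly as the bdev_malloc_create / bdev_passthru_create calls in the trace do.
$RPC bdev_malloc_create 32 4096 -b BaseBdev1_malloc
$RPC bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
$RPC bdev_malloc_create 32 4096 -b BaseBdev2_malloc
$RPC bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2

# Assemble the raid1 bdev with an on-disk superblock (-s), then inspect its state.
$RPC bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

The later parts of the trace drive the rebuild itself through the same RPC path: bdev_raid_remove_base_bdev degrades the array, and bdev_raid_add_base_bdev raid_bdev1 spare starts the rebuild whose process.type/target and progress.blocks/percent fields show up in the bdev_raid_get_bdevs output.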
00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@827 -- # '[' -z 3938033 ']' 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@832 -- # local max_retries=100 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:05.771 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # xtrace_disable 00:22:05.771 04:22:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:05.771 [2024-05-15 04:22:53.674988] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:22:05.771 [2024-05-15 04:22:53.675065] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3938033 ] 00:22:05.771 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:05.771 Zero copy mechanism will not be used. 00:22:05.771 [2024-05-15 04:22:53.750885] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:06.028 [2024-05-15 04:22:53.861343] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:06.028 [2024-05-15 04:22:53.933808] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:06.028 [2024-05-15 04:22:53.933867] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:06.962 04:22:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:22:06.962 04:22:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@860 -- # return 0 00:22:06.962 04:22:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:22:06.962 04:22:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:22:06.962 BaseBdev1_malloc 00:22:06.962 04:22:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:07.220 [2024-05-15 04:22:55.192830] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:07.220 [2024-05-15 04:22:55.192890] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:07.220 [2024-05-15 04:22:55.192921] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151a000 00:22:07.220 [2024-05-15 04:22:55.192937] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:07.220 [2024-05-15 04:22:55.194722] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:07.220 [2024-05-15 04:22:55.194753] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:07.220 BaseBdev1 00:22:07.220 04:22:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:22:07.220 04:22:55 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:22:07.787 BaseBdev2_malloc 00:22:07.787 04:22:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:07.787 [2024-05-15 04:22:55.778503] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:07.787 [2024-05-15 04:22:55.778570] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:07.787 [2024-05-15 04:22:55.778600] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c52c0 00:22:07.787 [2024-05-15 04:22:55.778616] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:07.787 [2024-05-15 04:22:55.780395] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:07.787 [2024-05-15 04:22:55.780425] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:07.787 BaseBdev2 00:22:07.787 04:22:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:22:08.045 spare_malloc 00:22:08.045 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:08.611 spare_delay 00:22:08.611 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@609 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:08.611 [2024-05-15 04:22:56.595552] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:08.611 [2024-05-15 04:22:56.595615] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:08.611 [2024-05-15 04:22:56.595641] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c9100 00:22:08.611 [2024-05-15 04:22:56.595657] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:08.611 [2024-05-15 04:22:56.597379] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:08.611 [2024-05-15 04:22:56.597409] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:08.611 spare 00:22:08.611 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:08.869 [2024-05-15 04:22:56.880340] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:08.869 [2024-05-15 04:22:56.881788] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:08.869 [2024-05-15 04:22:56.882004] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c9930 00:22:08.869 [2024-05-15 04:22:56.882023] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:08.869 [2024-05-15 04:22:56.882255] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x15190f0 00:22:08.869 [2024-05-15 04:22:56.882437] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c9930 00:22:08.869 [2024-05-15 04:22:56.882453] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16c9930 00:22:08.869 [2024-05-15 04:22:56.882595] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.128 04:22:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.386 04:22:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:09.386 "name": "raid_bdev1", 00:22:09.386 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:09.386 "strip_size_kb": 0, 00:22:09.386 "state": "online", 00:22:09.386 "raid_level": "raid1", 00:22:09.386 "superblock": true, 00:22:09.386 "num_base_bdevs": 2, 00:22:09.386 "num_base_bdevs_discovered": 2, 00:22:09.386 "num_base_bdevs_operational": 2, 00:22:09.386 "base_bdevs_list": [ 00:22:09.386 { 00:22:09.386 "name": "BaseBdev1", 00:22:09.386 "uuid": "b19464bf-6746-5748-be1c-efafadc52bcc", 00:22:09.386 "is_configured": true, 00:22:09.386 "data_offset": 256, 00:22:09.386 "data_size": 7936 00:22:09.386 }, 00:22:09.386 { 00:22:09.386 "name": "BaseBdev2", 00:22:09.386 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:09.386 "is_configured": true, 00:22:09.386 "data_offset": 256, 00:22:09.386 "data_size": 7936 00:22:09.386 } 00:22:09.386 ] 00:22:09.386 }' 00:22:09.386 04:22:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:09.386 04:22:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:09.951 04:22:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:09.951 04:22:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:22:09.951 [2024-05-15 04:22:57.967367] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:10.209 04:22:57 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@616 -- # raid_bdev_size=7936 00:22:10.209 04:22:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.209 04:22:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@619 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@619 -- # data_offset=256 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@621 -- # '[' false = true ']' 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # '[' true = true ']' 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@625 -- # local write_unit_size 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:10.467 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:10.467 [2024-05-15 04:22:58.460523] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1519600 00:22:10.467 /dev/nbd0 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@865 -- # local i 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # break 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:10.725 1+0 records in 00:22:10.725 1+0 records out 00:22:10.725 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000167055 s, 24.5 
MB/s 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # size=4096 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # return 0 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@629 -- # '[' raid1 = raid5f ']' 00:22:10.725 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@633 -- # write_unit_size=1 00:22:10.726 04:22:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:22:11.291 7936+0 records in 00:22:11.291 7936+0 records out 00:22:11.291 32505856 bytes (33 MB, 31 MiB) copied, 0.685143 s, 47.4 MB/s 00:22:11.291 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@636 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:11.291 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:11.291 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:11.291 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:11.291 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:22:11.291 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:11.291 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:11.550 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:11.550 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:11.550 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:11.550 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:11.550 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:11.550 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:11.550 [2024-05-15 04:22:59.475600] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:11.550 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:22:11.550 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:22:11.550 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:11.807 [2024-05-15 04:22:59.709005] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.807 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.065 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:12.065 "name": "raid_bdev1", 00:22:12.065 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:12.065 "strip_size_kb": 0, 00:22:12.065 "state": "online", 00:22:12.065 "raid_level": "raid1", 00:22:12.065 "superblock": true, 00:22:12.065 "num_base_bdevs": 2, 00:22:12.065 "num_base_bdevs_discovered": 1, 00:22:12.065 "num_base_bdevs_operational": 1, 00:22:12.065 "base_bdevs_list": [ 00:22:12.065 { 00:22:12.065 "name": null, 00:22:12.065 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:12.065 "is_configured": false, 00:22:12.065 "data_offset": 256, 00:22:12.065 "data_size": 7936 00:22:12.066 }, 00:22:12.066 { 00:22:12.066 "name": "BaseBdev2", 00:22:12.066 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:12.066 "is_configured": true, 00:22:12.066 "data_offset": 256, 00:22:12.066 "data_size": 7936 00:22:12.066 } 00:22:12.066 ] 00:22:12.066 }' 00:22:12.066 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:12.066 04:22:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:12.631 04:23:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:12.889 [2024-05-15 04:23:00.739783] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:12.889 [2024-05-15 04:23:00.746334] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c9390 00:22:12.889 [2024-05-15 04:23:00.748497] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:12.889 04:23:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@647 -- # sleep 1 00:22:13.823 04:23:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:13.823 04:23:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:13.823 04:23:01 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:13.823 04:23:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:13.823 04:23:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:13.823 04:23:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.823 04:23:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.081 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:14.082 "name": "raid_bdev1", 00:22:14.082 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:14.082 "strip_size_kb": 0, 00:22:14.082 "state": "online", 00:22:14.082 "raid_level": "raid1", 00:22:14.082 "superblock": true, 00:22:14.082 "num_base_bdevs": 2, 00:22:14.082 "num_base_bdevs_discovered": 2, 00:22:14.082 "num_base_bdevs_operational": 2, 00:22:14.082 "process": { 00:22:14.082 "type": "rebuild", 00:22:14.082 "target": "spare", 00:22:14.082 "progress": { 00:22:14.082 "blocks": 3072, 00:22:14.082 "percent": 38 00:22:14.082 } 00:22:14.082 }, 00:22:14.082 "base_bdevs_list": [ 00:22:14.082 { 00:22:14.082 "name": "spare", 00:22:14.082 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:14.082 "is_configured": true, 00:22:14.082 "data_offset": 256, 00:22:14.082 "data_size": 7936 00:22:14.082 }, 00:22:14.082 { 00:22:14.082 "name": "BaseBdev2", 00:22:14.082 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:14.082 "is_configured": true, 00:22:14.082 "data_offset": 256, 00:22:14.082 "data_size": 7936 00:22:14.082 } 00:22:14.082 ] 00:22:14.082 }' 00:22:14.082 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:14.082 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:14.082 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:14.339 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:14.339 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:14.598 [2024-05-15 04:23:02.383383] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:14.598 [2024-05-15 04:23:02.462420] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:14.598 [2024-05-15 04:23:02.462484] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local 
num_base_bdevs_operational=1 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.598 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.856 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:14.856 "name": "raid_bdev1", 00:22:14.856 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:14.856 "strip_size_kb": 0, 00:22:14.856 "state": "online", 00:22:14.856 "raid_level": "raid1", 00:22:14.856 "superblock": true, 00:22:14.856 "num_base_bdevs": 2, 00:22:14.856 "num_base_bdevs_discovered": 1, 00:22:14.856 "num_base_bdevs_operational": 1, 00:22:14.856 "base_bdevs_list": [ 00:22:14.856 { 00:22:14.856 "name": null, 00:22:14.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:14.856 "is_configured": false, 00:22:14.856 "data_offset": 256, 00:22:14.856 "data_size": 7936 00:22:14.856 }, 00:22:14.856 { 00:22:14.856 "name": "BaseBdev2", 00:22:14.856 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:14.856 "is_configured": true, 00:22:14.856 "data_offset": 256, 00:22:14.856 "data_size": 7936 00:22:14.856 } 00:22:14.856 ] 00:22:14.856 }' 00:22:14.856 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:14.856 04:23:02 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:15.421 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:15.421 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:15.421 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:15.421 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:15.422 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:15.422 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.422 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.679 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:15.679 "name": "raid_bdev1", 00:22:15.679 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:15.679 "strip_size_kb": 0, 00:22:15.679 "state": "online", 00:22:15.679 "raid_level": "raid1", 00:22:15.679 "superblock": true, 00:22:15.679 "num_base_bdevs": 2, 00:22:15.679 "num_base_bdevs_discovered": 1, 00:22:15.679 "num_base_bdevs_operational": 1, 00:22:15.679 "base_bdevs_list": [ 00:22:15.679 { 00:22:15.679 "name": null, 00:22:15.679 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:15.679 "is_configured": false, 00:22:15.679 "data_offset": 256, 00:22:15.679 "data_size": 7936 
00:22:15.679 }, 00:22:15.679 { 00:22:15.679 "name": "BaseBdev2", 00:22:15.679 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:15.679 "is_configured": true, 00:22:15.679 "data_offset": 256, 00:22:15.679 "data_size": 7936 00:22:15.679 } 00:22:15.679 ] 00:22:15.679 }' 00:22:15.679 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:15.679 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:15.679 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:15.679 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:15.679 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:15.937 [2024-05-15 04:23:03.876317] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:15.937 [2024-05-15 04:23:03.882743] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c7d90 00:22:15.937 [2024-05-15 04:23:03.884286] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:15.937 04:23:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # sleep 1 00:22:17.309 04:23:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:17.309 04:23:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:17.309 04:23:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:17.309 04:23:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:17.309 04:23:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:17.309 04:23:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.309 04:23:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:17.309 "name": "raid_bdev1", 00:22:17.309 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:17.309 "strip_size_kb": 0, 00:22:17.309 "state": "online", 00:22:17.309 "raid_level": "raid1", 00:22:17.309 "superblock": true, 00:22:17.309 "num_base_bdevs": 2, 00:22:17.309 "num_base_bdevs_discovered": 2, 00:22:17.309 "num_base_bdevs_operational": 2, 00:22:17.309 "process": { 00:22:17.309 "type": "rebuild", 00:22:17.309 "target": "spare", 00:22:17.309 "progress": { 00:22:17.309 "blocks": 3072, 00:22:17.309 "percent": 38 00:22:17.309 } 00:22:17.309 }, 00:22:17.309 "base_bdevs_list": [ 00:22:17.309 { 00:22:17.309 "name": "spare", 00:22:17.309 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:17.309 "is_configured": true, 00:22:17.309 "data_offset": 256, 00:22:17.309 "data_size": 7936 00:22:17.309 }, 00:22:17.309 { 00:22:17.309 "name": "BaseBdev2", 00:22:17.309 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:17.309 "is_configured": true, 00:22:17.309 "data_offset": 256, 00:22:17.309 "data_size": 7936 00:22:17.309 } 00:22:17.309 ] 00:22:17.309 }' 00:22:17.309 04:23:05 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@666 -- # '[' true = true ']' 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@666 -- # '[' = false ']' 00:22:17.309 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 666: [: =: unary operator expected 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=2 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@693 -- # '[' 2 -gt 2 ']' 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # local timeout=861 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.309 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.566 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:17.566 "name": "raid_bdev1", 00:22:17.566 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:17.566 "strip_size_kb": 0, 00:22:17.566 "state": "online", 00:22:17.566 "raid_level": "raid1", 00:22:17.566 "superblock": true, 00:22:17.566 "num_base_bdevs": 2, 00:22:17.566 "num_base_bdevs_discovered": 2, 00:22:17.566 "num_base_bdevs_operational": 2, 00:22:17.566 "process": { 00:22:17.566 "type": "rebuild", 00:22:17.566 "target": "spare", 00:22:17.566 "progress": { 00:22:17.566 "blocks": 3840, 00:22:17.566 "percent": 48 00:22:17.566 } 00:22:17.566 }, 00:22:17.566 "base_bdevs_list": [ 00:22:17.566 { 00:22:17.566 "name": "spare", 00:22:17.566 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:17.566 "is_configured": true, 00:22:17.566 "data_offset": 256, 00:22:17.566 "data_size": 7936 00:22:17.566 }, 00:22:17.566 { 00:22:17.566 "name": "BaseBdev2", 00:22:17.566 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:17.566 "is_configured": true, 00:22:17.566 "data_offset": 256, 00:22:17.566 "data_size": 7936 00:22:17.566 } 00:22:17.566 ] 00:22:17.566 }' 00:22:17.566 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:17.566 04:23:05 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:17.567 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:17.567 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:17.567 04:23:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@711 -- # sleep 1 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:18.939 "name": "raid_bdev1", 00:22:18.939 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:18.939 "strip_size_kb": 0, 00:22:18.939 "state": "online", 00:22:18.939 "raid_level": "raid1", 00:22:18.939 "superblock": true, 00:22:18.939 "num_base_bdevs": 2, 00:22:18.939 "num_base_bdevs_discovered": 2, 00:22:18.939 "num_base_bdevs_operational": 2, 00:22:18.939 "process": { 00:22:18.939 "type": "rebuild", 00:22:18.939 "target": "spare", 00:22:18.939 "progress": { 00:22:18.939 "blocks": 7168, 00:22:18.939 "percent": 90 00:22:18.939 } 00:22:18.939 }, 00:22:18.939 "base_bdevs_list": [ 00:22:18.939 { 00:22:18.939 "name": "spare", 00:22:18.939 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:18.939 "is_configured": true, 00:22:18.939 "data_offset": 256, 00:22:18.939 "data_size": 7936 00:22:18.939 }, 00:22:18.939 { 00:22:18.939 "name": "BaseBdev2", 00:22:18.939 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:18.939 "is_configured": true, 00:22:18.939 "data_offset": 256, 00:22:18.939 "data_size": 7936 00:22:18.939 } 00:22:18.939 ] 00:22:18.939 }' 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:18.939 04:23:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@711 -- # sleep 1 00:22:19.197 [2024-05-15 04:23:07.009978] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:19.197 [2024-05-15 04:23:07.010043] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:19.197 [2024-05-15 04:23:07.010171] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:22:20.211 04:23:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:22:20.211 04:23:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:20.211 04:23:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:20.211 04:23:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:20.211 04:23:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:20.211 04:23:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:20.211 04:23:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.211 04:23:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.211 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:20.211 "name": "raid_bdev1", 00:22:20.211 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:20.211 "strip_size_kb": 0, 00:22:20.211 "state": "online", 00:22:20.211 "raid_level": "raid1", 00:22:20.211 "superblock": true, 00:22:20.211 "num_base_bdevs": 2, 00:22:20.211 "num_base_bdevs_discovered": 2, 00:22:20.211 "num_base_bdevs_operational": 2, 00:22:20.211 "base_bdevs_list": [ 00:22:20.211 { 00:22:20.211 "name": "spare", 00:22:20.211 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:20.211 "is_configured": true, 00:22:20.211 "data_offset": 256, 00:22:20.211 "data_size": 7936 00:22:20.211 }, 00:22:20.211 { 00:22:20.211 "name": "BaseBdev2", 00:22:20.211 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:20.211 "is_configured": true, 00:22:20.211 "data_offset": 256, 00:22:20.212 "data_size": 7936 00:22:20.212 } 00:22:20.212 ] 00:22:20.212 }' 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@709 -- # break 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.212 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.470 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # 
raid_bdev_info='{ 00:22:20.470 "name": "raid_bdev1", 00:22:20.470 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:20.470 "strip_size_kb": 0, 00:22:20.470 "state": "online", 00:22:20.470 "raid_level": "raid1", 00:22:20.470 "superblock": true, 00:22:20.470 "num_base_bdevs": 2, 00:22:20.470 "num_base_bdevs_discovered": 2, 00:22:20.470 "num_base_bdevs_operational": 2, 00:22:20.470 "base_bdevs_list": [ 00:22:20.470 { 00:22:20.470 "name": "spare", 00:22:20.470 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:20.470 "is_configured": true, 00:22:20.470 "data_offset": 256, 00:22:20.470 "data_size": 7936 00:22:20.470 }, 00:22:20.470 { 00:22:20.470 "name": "BaseBdev2", 00:22:20.470 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:20.470 "is_configured": true, 00:22:20.470 "data_offset": 256, 00:22:20.470 "data_size": 7936 00:22:20.470 } 00:22:20.470 ] 00:22:20.470 }' 00:22:20.470 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.727 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.985 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:20.985 "name": "raid_bdev1", 00:22:20.985 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:20.985 "strip_size_kb": 0, 00:22:20.985 "state": "online", 00:22:20.985 "raid_level": "raid1", 00:22:20.985 "superblock": true, 00:22:20.985 "num_base_bdevs": 2, 00:22:20.985 "num_base_bdevs_discovered": 2, 00:22:20.985 "num_base_bdevs_operational": 2, 00:22:20.985 "base_bdevs_list": [ 00:22:20.985 { 00:22:20.985 "name": "spare", 00:22:20.985 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:20.985 "is_configured": true, 00:22:20.985 "data_offset": 256, 00:22:20.985 "data_size": 7936 00:22:20.985 }, 00:22:20.985 { 00:22:20.985 "name": 
"BaseBdev2", 00:22:20.985 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:20.985 "is_configured": true, 00:22:20.985 "data_offset": 256, 00:22:20.985 "data_size": 7936 00:22:20.985 } 00:22:20.985 ] 00:22:20.985 }' 00:22:20.985 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:20.985 04:23:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:21.550 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:21.550 [2024-05-15 04:23:09.539062] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:21.550 [2024-05-15 04:23:09.539095] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:21.550 [2024-05-15 04:23:09.539180] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:21.550 [2024-05-15 04:23:09.539252] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:21.550 [2024-05-15 04:23:09.539269] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c9930 name raid_bdev1, state offline 00:22:21.550 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.550 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@720 -- # jq length 00:22:21.807 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # '[' false = true ']' 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:21.808 04:23:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:22.066 /dev/nbd0 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@865 -- # local i 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # break 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:22.066 1+0 records in 00:22:22.066 1+0 records out 00:22:22.066 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000170336 s, 24.0 MB/s 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # size=4096 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # return 0 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:22.066 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:22.323 /dev/nbd1 00:22:22.323 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:22.323 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@865 -- # local i 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # break 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:22.581 1+0 records in 00:22:22.581 1+0 records out 00:22:22.581 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000244372 s, 16.8 MB/s 00:22:22.581 
04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # size=4096 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # return 0 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:22.581 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:22.839 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:22.839 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:22.839 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:22.839 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:22.839 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:22.839 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:22.839 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:22:22.839 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:22:22.839 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:22.839 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:23.097 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:23.097 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:23.097 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:23.097 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:23.097 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:23.097 04:23:10 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:23.097 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:22:23.097 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:22:23.097 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@743 -- # '[' true = true ']' 00:22:23.097 04:23:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:23.355 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@746 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:23.613 [2024-05-15 04:23:11.463439] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:23.613 [2024-05-15 04:23:11.463500] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:23.613 [2024-05-15 04:23:11.463527] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15193a0 00:22:23.613 [2024-05-15 04:23:11.463543] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:23.613 [2024-05-15 04:23:11.465330] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:23.613 [2024-05-15 04:23:11.465360] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:23.613 [2024-05-15 04:23:11.465464] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:23.613 [2024-05-15 04:23:11.465507] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:23.613 [2024-05-15 04:23:11.465644] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:23.613 spare 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.613 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.613 [2024-05-15 04:23:11.565980] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x16ca770 00:22:23.613 [2024-05-15 
04:23:11.566000] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:23.613 [2024-05-15 04:23:11.566178] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c5050 00:22:23.613 [2024-05-15 04:23:11.566365] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16ca770 00:22:23.613 [2024-05-15 04:23:11.566383] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16ca770 00:22:23.613 [2024-05-15 04:23:11.566502] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:23.871 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:23.871 "name": "raid_bdev1", 00:22:23.871 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:23.871 "strip_size_kb": 0, 00:22:23.871 "state": "online", 00:22:23.871 "raid_level": "raid1", 00:22:23.871 "superblock": true, 00:22:23.871 "num_base_bdevs": 2, 00:22:23.871 "num_base_bdevs_discovered": 2, 00:22:23.871 "num_base_bdevs_operational": 2, 00:22:23.871 "base_bdevs_list": [ 00:22:23.871 { 00:22:23.871 "name": "spare", 00:22:23.871 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:23.871 "is_configured": true, 00:22:23.871 "data_offset": 256, 00:22:23.871 "data_size": 7936 00:22:23.871 }, 00:22:23.871 { 00:22:23.871 "name": "BaseBdev2", 00:22:23.871 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:23.871 "is_configured": true, 00:22:23.871 "data_offset": 256, 00:22:23.871 "data_size": 7936 00:22:23.871 } 00:22:23.871 ] 00:22:23.871 }' 00:22:23.871 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:23.871 04:23:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:24.437 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:24.437 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:24.437 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:24.437 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:24.437 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:24.437 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.437 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:24.695 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:24.695 "name": "raid_bdev1", 00:22:24.695 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:24.695 "strip_size_kb": 0, 00:22:24.695 "state": "online", 00:22:24.695 "raid_level": "raid1", 00:22:24.695 "superblock": true, 00:22:24.695 "num_base_bdevs": 2, 00:22:24.695 "num_base_bdevs_discovered": 2, 00:22:24.695 "num_base_bdevs_operational": 2, 00:22:24.695 "base_bdevs_list": [ 00:22:24.695 { 00:22:24.695 "name": "spare", 00:22:24.695 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:24.695 "is_configured": true, 00:22:24.695 "data_offset": 256, 00:22:24.695 "data_size": 7936 00:22:24.695 }, 00:22:24.695 { 00:22:24.695 "name": "BaseBdev2", 00:22:24.695 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:24.695 "is_configured": true, 00:22:24.695 
"data_offset": 256, 00:22:24.695 "data_size": 7936 00:22:24.695 } 00:22:24.695 ] 00:22:24.695 }' 00:22:24.695 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:24.695 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:24.695 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:24.695 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:24.695 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@750 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.695 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@750 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:24.954 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@750 -- # [[ spare == \s\p\a\r\e ]] 00:22:24.954 04:23:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:25.212 [2024-05-15 04:23:13.067816] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.212 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.470 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:25.470 "name": "raid_bdev1", 00:22:25.470 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:25.470 "strip_size_kb": 0, 00:22:25.470 "state": "online", 00:22:25.470 "raid_level": "raid1", 00:22:25.470 "superblock": true, 00:22:25.470 "num_base_bdevs": 2, 00:22:25.470 "num_base_bdevs_discovered": 1, 00:22:25.470 "num_base_bdevs_operational": 1, 00:22:25.470 "base_bdevs_list": [ 00:22:25.470 { 00:22:25.470 "name": null, 00:22:25.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:25.470 "is_configured": false, 00:22:25.470 "data_offset": 256, 00:22:25.470 "data_size": 7936 00:22:25.470 }, 00:22:25.470 { 00:22:25.470 "name": "BaseBdev2", 00:22:25.470 "uuid": 
"051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:25.470 "is_configured": true, 00:22:25.470 "data_offset": 256, 00:22:25.470 "data_size": 7936 00:22:25.470 } 00:22:25.470 ] 00:22:25.470 }' 00:22:25.470 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:25.470 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:26.035 04:23:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:26.293 [2024-05-15 04:23:14.098573] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:26.293 [2024-05-15 04:23:14.098785] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:26.293 [2024-05-15 04:23:14.098807] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:26.293 [2024-05-15 04:23:14.098851] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:26.293 [2024-05-15 04:23:14.105682] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c7ed0 00:22:26.293 [2024-05-15 04:23:14.107998] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:26.293 04:23:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # sleep 1 00:22:27.227 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@757 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:27.227 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:27.227 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:27.227 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:27.227 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:27.227 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.228 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.486 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:27.486 "name": "raid_bdev1", 00:22:27.486 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:27.486 "strip_size_kb": 0, 00:22:27.486 "state": "online", 00:22:27.486 "raid_level": "raid1", 00:22:27.486 "superblock": true, 00:22:27.486 "num_base_bdevs": 2, 00:22:27.486 "num_base_bdevs_discovered": 2, 00:22:27.486 "num_base_bdevs_operational": 2, 00:22:27.486 "process": { 00:22:27.486 "type": "rebuild", 00:22:27.486 "target": "spare", 00:22:27.486 "progress": { 00:22:27.486 "blocks": 3072, 00:22:27.486 "percent": 38 00:22:27.486 } 00:22:27.486 }, 00:22:27.486 "base_bdevs_list": [ 00:22:27.486 { 00:22:27.486 "name": "spare", 00:22:27.486 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:27.486 "is_configured": true, 00:22:27.486 "data_offset": 256, 00:22:27.486 "data_size": 7936 00:22:27.486 }, 00:22:27.486 { 00:22:27.486 "name": "BaseBdev2", 00:22:27.486 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:27.486 "is_configured": true, 00:22:27.486 "data_offset": 256, 00:22:27.486 "data_size": 
7936 00:22:27.486 } 00:22:27.486 ] 00:22:27.486 }' 00:22:27.486 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:27.486 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:27.486 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:27.486 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:27.486 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:27.744 [2024-05-15 04:23:15.709946] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:27.744 [2024-05-15 04:23:15.720924] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:27.744 [2024-05-15 04:23:15.720978] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.744 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.002 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:28.002 "name": "raid_bdev1", 00:22:28.002 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:28.002 "strip_size_kb": 0, 00:22:28.002 "state": "online", 00:22:28.002 "raid_level": "raid1", 00:22:28.002 "superblock": true, 00:22:28.002 "num_base_bdevs": 2, 00:22:28.002 "num_base_bdevs_discovered": 1, 00:22:28.002 "num_base_bdevs_operational": 1, 00:22:28.002 "base_bdevs_list": [ 00:22:28.002 { 00:22:28.002 "name": null, 00:22:28.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:28.002 "is_configured": false, 00:22:28.002 "data_offset": 256, 00:22:28.002 "data_size": 7936 00:22:28.002 }, 00:22:28.002 { 00:22:28.002 "name": "BaseBdev2", 00:22:28.002 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:28.002 "is_configured": true, 00:22:28.002 "data_offset": 256, 00:22:28.002 "data_size": 7936 00:22:28.002 } 00:22:28.002 ] 00:22:28.002 }' 00:22:28.002 04:23:15 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:28.002 04:23:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:28.569 04:23:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:28.827 [2024-05-15 04:23:16.745631] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:28.827 [2024-05-15 04:23:16.745697] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:28.827 [2024-05-15 04:23:16.745733] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c7f90 00:22:28.827 [2024-05-15 04:23:16.745750] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:28.827 [2024-05-15 04:23:16.746221] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:28.827 [2024-05-15 04:23:16.746249] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:28.827 [2024-05-15 04:23:16.746357] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:28.827 [2024-05-15 04:23:16.746378] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:28.827 [2024-05-15 04:23:16.746391] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:28.827 [2024-05-15 04:23:16.746415] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:28.827 [2024-05-15 04:23:16.753201] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c5050 00:22:28.827 spare 00:22:28.827 [2024-05-15 04:23:16.754759] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:28.827 04:23:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # sleep 1 00:22:29.760 04:23:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:29.760 04:23:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:29.760 04:23:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:22:29.760 04:23:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=spare 00:22:29.760 04:23:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:30.019 04:23:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.019 04:23:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:30.019 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:30.019 "name": "raid_bdev1", 00:22:30.019 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:30.019 "strip_size_kb": 0, 00:22:30.019 "state": "online", 00:22:30.019 "raid_level": "raid1", 00:22:30.019 "superblock": true, 00:22:30.019 "num_base_bdevs": 2, 00:22:30.019 "num_base_bdevs_discovered": 2, 00:22:30.019 "num_base_bdevs_operational": 2, 00:22:30.019 "process": { 00:22:30.019 "type": "rebuild", 00:22:30.019 "target": "spare", 00:22:30.019 "progress": { 
00:22:30.019 "blocks": 3072, 00:22:30.019 "percent": 38 00:22:30.019 } 00:22:30.019 }, 00:22:30.019 "base_bdevs_list": [ 00:22:30.019 { 00:22:30.019 "name": "spare", 00:22:30.019 "uuid": "2ddb7d4f-079b-560c-9ef8-1b6f3925ec92", 00:22:30.019 "is_configured": true, 00:22:30.019 "data_offset": 256, 00:22:30.019 "data_size": 7936 00:22:30.019 }, 00:22:30.019 { 00:22:30.019 "name": "BaseBdev2", 00:22:30.019 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:30.019 "is_configured": true, 00:22:30.019 "data_offset": 256, 00:22:30.019 "data_size": 7936 00:22:30.019 } 00:22:30.019 ] 00:22:30.019 }' 00:22:30.019 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:30.277 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:30.277 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:30.277 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:22:30.277 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:30.535 [2024-05-15 04:23:18.329646] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:30.535 [2024-05-15 04:23:18.368144] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:30.535 [2024-05-15 04:23:18.368209] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.535 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:30.792 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:30.792 "name": "raid_bdev1", 00:22:30.792 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:30.792 "strip_size_kb": 0, 00:22:30.792 "state": "online", 00:22:30.792 "raid_level": "raid1", 00:22:30.792 "superblock": true, 00:22:30.792 "num_base_bdevs": 2, 00:22:30.792 "num_base_bdevs_discovered": 1, 00:22:30.792 
"num_base_bdevs_operational": 1, 00:22:30.792 "base_bdevs_list": [ 00:22:30.792 { 00:22:30.792 "name": null, 00:22:30.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:30.792 "is_configured": false, 00:22:30.793 "data_offset": 256, 00:22:30.793 "data_size": 7936 00:22:30.793 }, 00:22:30.793 { 00:22:30.793 "name": "BaseBdev2", 00:22:30.793 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:30.793 "is_configured": true, 00:22:30.793 "data_offset": 256, 00:22:30.793 "data_size": 7936 00:22:30.793 } 00:22:30.793 ] 00:22:30.793 }' 00:22:30.793 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:30.793 04:23:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:31.357 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:31.357 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:31.357 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:31.357 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:31.357 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:31.357 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.357 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:31.615 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:31.615 "name": "raid_bdev1", 00:22:31.615 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:31.615 "strip_size_kb": 0, 00:22:31.615 "state": "online", 00:22:31.615 "raid_level": "raid1", 00:22:31.615 "superblock": true, 00:22:31.615 "num_base_bdevs": 2, 00:22:31.615 "num_base_bdevs_discovered": 1, 00:22:31.615 "num_base_bdevs_operational": 1, 00:22:31.615 "base_bdevs_list": [ 00:22:31.615 { 00:22:31.615 "name": null, 00:22:31.615 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:31.615 "is_configured": false, 00:22:31.615 "data_offset": 256, 00:22:31.615 "data_size": 7936 00:22:31.615 }, 00:22:31.615 { 00:22:31.615 "name": "BaseBdev2", 00:22:31.615 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:31.615 "is_configured": true, 00:22:31.615 "data_offset": 256, 00:22:31.615 "data_size": 7936 00:22:31.615 } 00:22:31.615 ] 00:22:31.615 }' 00:22:31.615 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:31.615 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:31.615 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:31.615 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:31.615 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:31.872 04:23:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:32.138 [2024-05-15 04:23:20.021988] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:32.138 [2024-05-15 04:23:20.022052] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.138 [2024-05-15 04:23:20.022082] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16cacb0 00:22:32.138 [2024-05-15 04:23:20.022098] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.138 [2024-05-15 04:23:20.022539] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.138 [2024-05-15 04:23:20.022562] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:32.138 [2024-05-15 04:23:20.022659] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:32.138 [2024-05-15 04:23:20.022677] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:32.138 [2024-05-15 04:23:20.022686] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:32.138 BaseBdev1 00:22:32.138 04:23:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # sleep 1 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.076 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:33.334 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:33.334 "name": "raid_bdev1", 00:22:33.334 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:33.334 "strip_size_kb": 0, 00:22:33.334 "state": "online", 00:22:33.334 "raid_level": "raid1", 00:22:33.334 "superblock": true, 00:22:33.334 "num_base_bdevs": 2, 00:22:33.334 "num_base_bdevs_discovered": 1, 00:22:33.334 "num_base_bdevs_operational": 1, 00:22:33.334 "base_bdevs_list": [ 00:22:33.334 { 00:22:33.334 "name": null, 00:22:33.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.334 "is_configured": false, 00:22:33.334 "data_offset": 256, 00:22:33.334 "data_size": 7936 00:22:33.334 }, 00:22:33.334 { 00:22:33.334 "name": "BaseBdev2", 00:22:33.334 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 
00:22:33.334 "is_configured": true, 00:22:33.334 "data_offset": 256, 00:22:33.334 "data_size": 7936 00:22:33.334 } 00:22:33.334 ] 00:22:33.334 }' 00:22:33.334 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:33.334 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:33.900 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:33.900 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:33.900 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:33.900 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:33.900 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:33.900 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.900 04:23:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:34.158 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:34.158 "name": "raid_bdev1", 00:22:34.158 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:34.158 "strip_size_kb": 0, 00:22:34.158 "state": "online", 00:22:34.158 "raid_level": "raid1", 00:22:34.158 "superblock": true, 00:22:34.158 "num_base_bdevs": 2, 00:22:34.158 "num_base_bdevs_discovered": 1, 00:22:34.158 "num_base_bdevs_operational": 1, 00:22:34.158 "base_bdevs_list": [ 00:22:34.158 { 00:22:34.158 "name": null, 00:22:34.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.158 "is_configured": false, 00:22:34.158 "data_offset": 256, 00:22:34.158 "data_size": 7936 00:22:34.158 }, 00:22:34.158 { 00:22:34.158 "name": "BaseBdev2", 00:22:34.158 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:34.158 "is_configured": true, 00:22:34.158 "data_offset": 256, 00:22:34.158 "data_size": 7936 00:22:34.158 } 00:22:34.158 ] 00:22:34.158 }' 00:22:34.158 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:22:34.158 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:34.158 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:34.416 
04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:34.416 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:34.416 [2024-05-15 04:23:22.420392] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:34.416 [2024-05-15 04:23:22.420559] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:34.416 [2024-05-15 04:23:22.420580] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:34.416 request: 00:22:34.416 { 00:22:34.416 "raid_bdev": "raid_bdev1", 00:22:34.416 "base_bdev": "BaseBdev1", 00:22:34.416 "method": "bdev_raid_add_base_bdev", 00:22:34.416 "req_id": 1 00:22:34.416 } 00:22:34.416 Got JSON-RPC error response 00:22:34.416 response: 00:22:34.416 { 00:22:34.416 "code": -22, 00:22:34.416 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:34.416 } 00:22:34.674 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:22:34.674 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:34.674 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:34.674 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:34.674 04:23:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # sleep 1 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:35.619 04:23:23 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.619 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.878 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:35.878 "name": "raid_bdev1", 00:22:35.878 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:35.878 "strip_size_kb": 0, 00:22:35.878 "state": "online", 00:22:35.878 "raid_level": "raid1", 00:22:35.878 "superblock": true, 00:22:35.878 "num_base_bdevs": 2, 00:22:35.878 "num_base_bdevs_discovered": 1, 00:22:35.878 "num_base_bdevs_operational": 1, 00:22:35.878 "base_bdevs_list": [ 00:22:35.878 { 00:22:35.878 "name": null, 00:22:35.878 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:35.878 "is_configured": false, 00:22:35.878 "data_offset": 256, 00:22:35.878 "data_size": 7936 00:22:35.878 }, 00:22:35.878 { 00:22:35.878 "name": "BaseBdev2", 00:22:35.878 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:35.878 "is_configured": true, 00:22:35.878 "data_offset": 256, 00:22:35.878 "data_size": 7936 00:22:35.878 } 00:22:35.878 ] 00:22:35.878 }' 00:22:35.878 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:35.878 04:23:23 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:36.443 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:36.443 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:22:36.443 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:22:36.443 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local target=none 00:22:36.443 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:22:36.443 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.443 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:22:36.701 "name": "raid_bdev1", 00:22:36.701 "uuid": "08541022-9157-4c93-b122-eece1ef83f5f", 00:22:36.701 "strip_size_kb": 0, 00:22:36.701 "state": "online", 00:22:36.701 "raid_level": "raid1", 00:22:36.701 "superblock": true, 00:22:36.701 "num_base_bdevs": 2, 00:22:36.701 "num_base_bdevs_discovered": 1, 00:22:36.701 "num_base_bdevs_operational": 1, 00:22:36.701 "base_bdevs_list": [ 00:22:36.701 { 00:22:36.701 "name": null, 00:22:36.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.701 "is_configured": false, 00:22:36.701 "data_offset": 256, 00:22:36.701 "data_size": 7936 00:22:36.701 }, 00:22:36.701 { 00:22:36.701 "name": "BaseBdev2", 00:22:36.701 "uuid": "051e4e02-fa47-5d25-9fcd-989b26726b22", 00:22:36.701 "is_configured": true, 00:22:36.701 "data_offset": 256, 00:22:36.701 "data_size": 7936 00:22:36.701 } 00:22:36.701 ] 00:22:36.701 }' 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // 
"none"' 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@783 -- # killprocess 3938033 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@946 -- # '[' -z 3938033 ']' 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@950 -- # kill -0 3938033 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@951 -- # uname 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3938033 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3938033' 00:22:36.701 killing process with pid 3938033 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@965 -- # kill 3938033 00:22:36.701 Received shutdown signal, test time was about 60.000000 seconds 00:22:36.701 00:22:36.701 Latency(us) 00:22:36.701 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:36.701 =================================================================================================================== 00:22:36.701 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:36.701 [2024-05-15 04:23:24.580949] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:36.701 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@970 -- # wait 3938033 00:22:36.701 [2024-05-15 04:23:24.581051] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:36.701 [2024-05-15 04:23:24.581120] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:36.701 [2024-05-15 04:23:24.581136] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16ca770 name raid_bdev1, state offline 00:22:36.701 [2024-05-15 04:23:24.614093] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:36.959 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@785 -- # return 0 00:22:36.959 00:22:36.959 real 0m31.274s 00:22:36.959 user 0m49.545s 00:22:36.959 sys 0m4.056s 00:22:36.959 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1122 -- # xtrace_disable 00:22:36.959 04:23:24 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:22:36.959 ************************************ 00:22:36.959 END TEST raid_rebuild_test_sb_4k 00:22:36.959 ************************************ 00:22:36.959 04:23:24 bdev_raid -- bdev/bdev_raid.sh@838 -- # base_malloc_params='-m 32' 00:22:36.959 04:23:24 bdev_raid -- bdev/bdev_raid.sh@839 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:22:36.959 04:23:24 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 
00:22:36.959 04:23:24 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:22:36.959 04:23:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:36.959 ************************************ 00:22:36.959 START TEST raid_state_function_test_sb_md_separate 00:22:36.959 ************************************ 00:22:36.959 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:22:36.959 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:22:36.959 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:22:36.959 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:22:36.959 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:22:36.959 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:22:36.959 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:22:36.959 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # raid_pid=3942198 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:36.960 04:23:24 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3942198' 00:22:36.960 Process raid pid: 3942198 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@247 -- # waitforlisten 3942198 /var/tmp/spdk-raid.sock 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@827 -- # '[' -z 3942198 ']' 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@832 -- # local max_retries=100 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:36.960 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # xtrace_disable 00:22:36.960 04:23:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:37.218 [2024-05-15 04:23:25.002768] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:22:37.218 [2024-05-15 04:23:25.002854] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:37.218 [2024-05-15 04:23:25.079307] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:37.218 [2024-05-15 04:23:25.189768] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:37.475 [2024-05-15 04:23:25.257071] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:37.475 [2024-05-15 04:23:25.257133] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:38.041 04:23:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:22:38.041 04:23:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@860 -- # return 0 00:22:38.041 04:23:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:38.299 [2024-05-15 04:23:26.236261] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:38.299 [2024-05-15 04:23:26.236308] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:38.299 [2024-05-15 04:23:26.236322] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:38.299 [2024-05-15 04:23:26.236335] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:22:38.299 
04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.299 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:38.558 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:38.558 "name": "Existed_Raid", 00:22:38.558 "uuid": "eb4494b3-894e-4836-93c1-090bdbd53acd", 00:22:38.558 "strip_size_kb": 0, 00:22:38.558 "state": "configuring", 00:22:38.558 "raid_level": "raid1", 00:22:38.558 "superblock": true, 00:22:38.558 "num_base_bdevs": 2, 00:22:38.558 "num_base_bdevs_discovered": 0, 00:22:38.558 "num_base_bdevs_operational": 2, 00:22:38.558 "base_bdevs_list": [ 00:22:38.558 { 00:22:38.558 "name": "BaseBdev1", 00:22:38.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:38.558 "is_configured": false, 00:22:38.558 "data_offset": 0, 00:22:38.558 "data_size": 0 00:22:38.558 }, 00:22:38.558 { 00:22:38.558 "name": "BaseBdev2", 00:22:38.558 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:38.558 "is_configured": false, 00:22:38.558 "data_offset": 0, 00:22:38.558 "data_size": 0 00:22:38.558 } 00:22:38.558 ] 00:22:38.558 }' 00:22:38.558 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:38.558 04:23:26 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:39.126 04:23:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:39.384 [2024-05-15 04:23:27.302952] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:39.384 [2024-05-15 04:23:27.302983] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x169b000 name Existed_Raid, state configuring 00:22:39.384 04:23:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:39.642 [2024-05-15 04:23:27.539606] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:39.642 [2024-05-15 04:23:27.539649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:39.642 [2024-05-15 04:23:27.539659] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev2 00:22:39.642 [2024-05-15 04:23:27.539669] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:39.642 04:23:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:22:39.900 [2024-05-15 04:23:27.788407] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:39.900 BaseBdev1 00:22:39.900 04:23:27 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:22:39.900 04:23:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:22:39.900 04:23:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:22:39.900 04:23:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local i 00:22:39.900 04:23:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:22:39.900 04:23:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:22:39.900 04:23:27 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:40.195 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:40.478 [ 00:22:40.478 { 00:22:40.478 "name": "BaseBdev1", 00:22:40.478 "aliases": [ 00:22:40.478 "3c4afd44-bfac-4b60-8e24-9d562bcd4f79" 00:22:40.478 ], 00:22:40.478 "product_name": "Malloc disk", 00:22:40.478 "block_size": 4096, 00:22:40.478 "num_blocks": 8192, 00:22:40.478 "uuid": "3c4afd44-bfac-4b60-8e24-9d562bcd4f79", 00:22:40.478 "md_size": 32, 00:22:40.478 "md_interleave": false, 00:22:40.478 "dif_type": 0, 00:22:40.478 "assigned_rate_limits": { 00:22:40.478 "rw_ios_per_sec": 0, 00:22:40.478 "rw_mbytes_per_sec": 0, 00:22:40.478 "r_mbytes_per_sec": 0, 00:22:40.478 "w_mbytes_per_sec": 0 00:22:40.478 }, 00:22:40.478 "claimed": true, 00:22:40.478 "claim_type": "exclusive_write", 00:22:40.478 "zoned": false, 00:22:40.478 "supported_io_types": { 00:22:40.478 "read": true, 00:22:40.478 "write": true, 00:22:40.478 "unmap": true, 00:22:40.478 "write_zeroes": true, 00:22:40.478 "flush": true, 00:22:40.478 "reset": true, 00:22:40.478 "compare": false, 00:22:40.478 "compare_and_write": false, 00:22:40.478 "abort": true, 00:22:40.478 "nvme_admin": false, 00:22:40.478 "nvme_io": false 00:22:40.478 }, 00:22:40.478 "memory_domains": [ 00:22:40.478 { 00:22:40.478 "dma_device_id": "system", 00:22:40.478 "dma_device_type": 1 00:22:40.478 }, 00:22:40.478 { 00:22:40.478 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:40.478 "dma_device_type": 2 00:22:40.478 } 00:22:40.478 ], 00:22:40.478 "driver_specific": {} 00:22:40.478 } 00:22:40.478 ] 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # return 0 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:40.478 04:23:28 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.478 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:40.736 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:40.736 "name": "Existed_Raid", 00:22:40.736 "uuid": "0c4d7850-b2bc-4103-9d4a-378723fb6f2d", 00:22:40.736 "strip_size_kb": 0, 00:22:40.736 "state": "configuring", 00:22:40.736 "raid_level": "raid1", 00:22:40.736 "superblock": true, 00:22:40.736 "num_base_bdevs": 2, 00:22:40.736 "num_base_bdevs_discovered": 1, 00:22:40.736 "num_base_bdevs_operational": 2, 00:22:40.736 "base_bdevs_list": [ 00:22:40.736 { 00:22:40.736 "name": "BaseBdev1", 00:22:40.736 "uuid": "3c4afd44-bfac-4b60-8e24-9d562bcd4f79", 00:22:40.736 "is_configured": true, 00:22:40.736 "data_offset": 256, 00:22:40.736 "data_size": 7936 00:22:40.736 }, 00:22:40.736 { 00:22:40.736 "name": "BaseBdev2", 00:22:40.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:40.736 "is_configured": false, 00:22:40.736 "data_offset": 0, 00:22:40.736 "data_size": 0 00:22:40.736 } 00:22:40.736 ] 00:22:40.736 }' 00:22:40.736 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:40.736 04:23:28 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:41.302 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:41.560 [2024-05-15 04:23:29.332714] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:41.560 [2024-05-15 04:23:29.332760] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x169a8f0 name Existed_Raid, state configuring 00:22:41.560 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:22:41.818 [2024-05-15 04:23:29.581419] 
bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:41.818 [2024-05-15 04:23:29.582786] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:41.818 [2024-05-15 04:23:29.582816] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.818 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:42.076 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:42.076 "name": "Existed_Raid", 00:22:42.076 "uuid": "7a2b6b40-7b3e-40fe-911a-61f3797bb588", 00:22:42.076 "strip_size_kb": 0, 00:22:42.076 "state": "configuring", 00:22:42.076 "raid_level": "raid1", 00:22:42.076 "superblock": true, 00:22:42.076 "num_base_bdevs": 2, 00:22:42.076 "num_base_bdevs_discovered": 1, 00:22:42.076 "num_base_bdevs_operational": 2, 00:22:42.076 "base_bdevs_list": [ 00:22:42.076 { 00:22:42.076 "name": "BaseBdev1", 00:22:42.076 "uuid": "3c4afd44-bfac-4b60-8e24-9d562bcd4f79", 00:22:42.076 "is_configured": true, 00:22:42.076 "data_offset": 256, 00:22:42.076 "data_size": 7936 00:22:42.076 }, 00:22:42.076 { 00:22:42.076 "name": "BaseBdev2", 00:22:42.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.076 "is_configured": false, 00:22:42.076 "data_offset": 0, 00:22:42.076 "data_size": 0 00:22:42.076 } 00:22:42.076 ] 00:22:42.076 }' 00:22:42.076 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:42.076 04:23:29 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:42.642 04:23:30 bdev_raid.raid_state_function_test_sb_md_separate 
-- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:22:42.642 [2024-05-15 04:23:30.633297] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:42.642 [2024-05-15 04:23:30.633499] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x169c990 00:22:42.642 [2024-05-15 04:23:30.633514] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:42.642 [2024-05-15 04:23:30.633567] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x169c3d0 00:22:42.642 [2024-05-15 04:23:30.633666] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x169c990 00:22:42.642 [2024-05-15 04:23:30.633680] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x169c990 00:22:42.642 [2024-05-15 04:23:30.633751] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:42.642 BaseBdev2 00:22:42.642 04:23:30 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@269 -- # waitforbdev BaseBdev2 00:22:42.642 04:23:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:22:42.642 04:23:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:22:42.642 04:23:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local i 00:22:42.642 04:23:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:22:42.642 04:23:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:22:42.642 04:23:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:42.900 04:23:30 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:43.158 [ 00:22:43.158 { 00:22:43.158 "name": "BaseBdev2", 00:22:43.158 "aliases": [ 00:22:43.158 "423f92f3-eb90-47ec-8050-8267a5dacac7" 00:22:43.158 ], 00:22:43.158 "product_name": "Malloc disk", 00:22:43.158 "block_size": 4096, 00:22:43.158 "num_blocks": 8192, 00:22:43.158 "uuid": "423f92f3-eb90-47ec-8050-8267a5dacac7", 00:22:43.158 "md_size": 32, 00:22:43.158 "md_interleave": false, 00:22:43.158 "dif_type": 0, 00:22:43.158 "assigned_rate_limits": { 00:22:43.158 "rw_ios_per_sec": 0, 00:22:43.158 "rw_mbytes_per_sec": 0, 00:22:43.158 "r_mbytes_per_sec": 0, 00:22:43.158 "w_mbytes_per_sec": 0 00:22:43.158 }, 00:22:43.158 "claimed": true, 00:22:43.158 "claim_type": "exclusive_write", 00:22:43.158 "zoned": false, 00:22:43.158 "supported_io_types": { 00:22:43.158 "read": true, 00:22:43.158 "write": true, 00:22:43.158 "unmap": true, 00:22:43.158 "write_zeroes": true, 00:22:43.158 "flush": true, 00:22:43.158 "reset": true, 00:22:43.158 "compare": false, 00:22:43.158 "compare_and_write": false, 00:22:43.158 "abort": true, 00:22:43.158 "nvme_admin": false, 00:22:43.158 "nvme_io": false 00:22:43.158 }, 00:22:43.158 "memory_domains": [ 00:22:43.158 { 00:22:43.158 "dma_device_id": "system", 00:22:43.158 "dma_device_type": 1 00:22:43.158 }, 00:22:43.158 
{ 00:22:43.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:43.158 "dma_device_type": 2 00:22:43.158 } 00:22:43.158 ], 00:22:43.158 "driver_specific": {} 00:22:43.158 } 00:22:43.158 ] 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # return 0 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.158 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:43.416 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:43.416 "name": "Existed_Raid", 00:22:43.416 "uuid": "7a2b6b40-7b3e-40fe-911a-61f3797bb588", 00:22:43.416 "strip_size_kb": 0, 00:22:43.416 "state": "online", 00:22:43.416 "raid_level": "raid1", 00:22:43.416 "superblock": true, 00:22:43.416 "num_base_bdevs": 2, 00:22:43.416 "num_base_bdevs_discovered": 2, 00:22:43.416 "num_base_bdevs_operational": 2, 00:22:43.416 "base_bdevs_list": [ 00:22:43.416 { 00:22:43.416 "name": "BaseBdev1", 00:22:43.416 "uuid": "3c4afd44-bfac-4b60-8e24-9d562bcd4f79", 00:22:43.416 "is_configured": true, 00:22:43.416 "data_offset": 256, 00:22:43.416 "data_size": 7936 00:22:43.416 }, 00:22:43.416 { 00:22:43.416 "name": "BaseBdev2", 00:22:43.416 "uuid": "423f92f3-eb90-47ec-8050-8267a5dacac7", 00:22:43.416 "is_configured": true, 00:22:43.416 "data_offset": 256, 00:22:43.416 "data_size": 7936 00:22:43.416 } 00:22:43.416 ] 00:22:43.416 }' 00:22:43.416 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:43.416 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:43.981 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@272 -- # 
verify_raid_bdev_properties Existed_Raid 00:22:43.981 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:22:43.981 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:22:43.981 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:22:43.981 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:22:43.981 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@199 -- # local name 00:22:43.981 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:43.981 04:23:31 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:22:44.239 [2024-05-15 04:23:32.129607] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:44.239 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:22:44.239 "name": "Existed_Raid", 00:22:44.239 "aliases": [ 00:22:44.239 "7a2b6b40-7b3e-40fe-911a-61f3797bb588" 00:22:44.239 ], 00:22:44.239 "product_name": "Raid Volume", 00:22:44.239 "block_size": 4096, 00:22:44.239 "num_blocks": 7936, 00:22:44.239 "uuid": "7a2b6b40-7b3e-40fe-911a-61f3797bb588", 00:22:44.239 "md_size": 32, 00:22:44.239 "md_interleave": false, 00:22:44.239 "dif_type": 0, 00:22:44.239 "assigned_rate_limits": { 00:22:44.239 "rw_ios_per_sec": 0, 00:22:44.239 "rw_mbytes_per_sec": 0, 00:22:44.239 "r_mbytes_per_sec": 0, 00:22:44.239 "w_mbytes_per_sec": 0 00:22:44.239 }, 00:22:44.239 "claimed": false, 00:22:44.239 "zoned": false, 00:22:44.239 "supported_io_types": { 00:22:44.239 "read": true, 00:22:44.239 "write": true, 00:22:44.239 "unmap": false, 00:22:44.239 "write_zeroes": true, 00:22:44.239 "flush": false, 00:22:44.239 "reset": true, 00:22:44.239 "compare": false, 00:22:44.239 "compare_and_write": false, 00:22:44.239 "abort": false, 00:22:44.239 "nvme_admin": false, 00:22:44.239 "nvme_io": false 00:22:44.239 }, 00:22:44.239 "memory_domains": [ 00:22:44.239 { 00:22:44.239 "dma_device_id": "system", 00:22:44.239 "dma_device_type": 1 00:22:44.239 }, 00:22:44.239 { 00:22:44.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.239 "dma_device_type": 2 00:22:44.239 }, 00:22:44.239 { 00:22:44.239 "dma_device_id": "system", 00:22:44.239 "dma_device_type": 1 00:22:44.239 }, 00:22:44.239 { 00:22:44.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.239 "dma_device_type": 2 00:22:44.239 } 00:22:44.239 ], 00:22:44.239 "driver_specific": { 00:22:44.239 "raid": { 00:22:44.239 "uuid": "7a2b6b40-7b3e-40fe-911a-61f3797bb588", 00:22:44.239 "strip_size_kb": 0, 00:22:44.239 "state": "online", 00:22:44.239 "raid_level": "raid1", 00:22:44.239 "superblock": true, 00:22:44.239 "num_base_bdevs": 2, 00:22:44.239 "num_base_bdevs_discovered": 2, 00:22:44.239 "num_base_bdevs_operational": 2, 00:22:44.239 "base_bdevs_list": [ 00:22:44.239 { 00:22:44.239 "name": "BaseBdev1", 00:22:44.239 "uuid": "3c4afd44-bfac-4b60-8e24-9d562bcd4f79", 00:22:44.239 "is_configured": true, 00:22:44.239 "data_offset": 256, 00:22:44.239 "data_size": 7936 00:22:44.239 }, 00:22:44.239 { 00:22:44.239 "name": "BaseBdev2", 00:22:44.239 "uuid": "423f92f3-eb90-47ec-8050-8267a5dacac7", 00:22:44.239 
"is_configured": true, 00:22:44.239 "data_offset": 256, 00:22:44.239 "data_size": 7936 00:22:44.239 } 00:22:44.239 ] 00:22:44.239 } 00:22:44.239 } 00:22:44.239 }' 00:22:44.239 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:44.239 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:22:44.239 BaseBdev2' 00:22:44.239 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:22:44.239 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:44.239 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:22:44.497 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:22:44.497 "name": "BaseBdev1", 00:22:44.497 "aliases": [ 00:22:44.497 "3c4afd44-bfac-4b60-8e24-9d562bcd4f79" 00:22:44.497 ], 00:22:44.497 "product_name": "Malloc disk", 00:22:44.497 "block_size": 4096, 00:22:44.497 "num_blocks": 8192, 00:22:44.497 "uuid": "3c4afd44-bfac-4b60-8e24-9d562bcd4f79", 00:22:44.497 "md_size": 32, 00:22:44.497 "md_interleave": false, 00:22:44.497 "dif_type": 0, 00:22:44.497 "assigned_rate_limits": { 00:22:44.497 "rw_ios_per_sec": 0, 00:22:44.497 "rw_mbytes_per_sec": 0, 00:22:44.497 "r_mbytes_per_sec": 0, 00:22:44.497 "w_mbytes_per_sec": 0 00:22:44.497 }, 00:22:44.497 "claimed": true, 00:22:44.497 "claim_type": "exclusive_write", 00:22:44.497 "zoned": false, 00:22:44.497 "supported_io_types": { 00:22:44.497 "read": true, 00:22:44.497 "write": true, 00:22:44.497 "unmap": true, 00:22:44.497 "write_zeroes": true, 00:22:44.497 "flush": true, 00:22:44.497 "reset": true, 00:22:44.497 "compare": false, 00:22:44.497 "compare_and_write": false, 00:22:44.497 "abort": true, 00:22:44.497 "nvme_admin": false, 00:22:44.497 "nvme_io": false 00:22:44.497 }, 00:22:44.497 "memory_domains": [ 00:22:44.497 { 00:22:44.497 "dma_device_id": "system", 00:22:44.497 "dma_device_type": 1 00:22:44.498 }, 00:22:44.498 { 00:22:44.498 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.498 "dma_device_type": 2 00:22:44.498 } 00:22:44.498 ], 00:22:44.498 "driver_specific": {} 00:22:44.498 }' 00:22:44.498 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:44.498 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:44.498 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:22:44.498 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false 
]] 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:44.756 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:22:45.014 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:22:45.014 "name": "BaseBdev2", 00:22:45.014 "aliases": [ 00:22:45.014 "423f92f3-eb90-47ec-8050-8267a5dacac7" 00:22:45.014 ], 00:22:45.014 "product_name": "Malloc disk", 00:22:45.014 "block_size": 4096, 00:22:45.014 "num_blocks": 8192, 00:22:45.014 "uuid": "423f92f3-eb90-47ec-8050-8267a5dacac7", 00:22:45.014 "md_size": 32, 00:22:45.014 "md_interleave": false, 00:22:45.014 "dif_type": 0, 00:22:45.014 "assigned_rate_limits": { 00:22:45.014 "rw_ios_per_sec": 0, 00:22:45.014 "rw_mbytes_per_sec": 0, 00:22:45.014 "r_mbytes_per_sec": 0, 00:22:45.014 "w_mbytes_per_sec": 0 00:22:45.014 }, 00:22:45.014 "claimed": true, 00:22:45.014 "claim_type": "exclusive_write", 00:22:45.014 "zoned": false, 00:22:45.014 "supported_io_types": { 00:22:45.014 "read": true, 00:22:45.014 "write": true, 00:22:45.014 "unmap": true, 00:22:45.014 "write_zeroes": true, 00:22:45.014 "flush": true, 00:22:45.014 "reset": true, 00:22:45.014 "compare": false, 00:22:45.014 "compare_and_write": false, 00:22:45.014 "abort": true, 00:22:45.014 "nvme_admin": false, 00:22:45.014 "nvme_io": false 00:22:45.014 }, 00:22:45.014 "memory_domains": [ 00:22:45.014 { 00:22:45.014 "dma_device_id": "system", 00:22:45.014 "dma_device_type": 1 00:22:45.014 }, 00:22:45.014 { 00:22:45.014 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.014 "dma_device_type": 2 00:22:45.014 } 00:22:45.014 ], 00:22:45.014 "driver_specific": {} 00:22:45.014 }' 00:22:45.014 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:45.014 04:23:32 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:45.014 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:22:45.014 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:45.272 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:45.272 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:22:45.272 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:45.272 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:45.272 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:22:45.272 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:45.272 04:23:33 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:45.273 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:22:45.273 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:45.531 [2024-05-15 04:23:33.425000] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # local expected_state 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # case $1 in 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@215 -- # return 0 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.531 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:45.789 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:45.789 "name": "Existed_Raid", 00:22:45.789 "uuid": "7a2b6b40-7b3e-40fe-911a-61f3797bb588", 00:22:45.789 "strip_size_kb": 0, 00:22:45.789 "state": "online", 00:22:45.789 "raid_level": "raid1", 00:22:45.789 "superblock": true, 00:22:45.789 "num_base_bdevs": 2, 00:22:45.789 "num_base_bdevs_discovered": 1, 00:22:45.789 "num_base_bdevs_operational": 1, 00:22:45.789 "base_bdevs_list": [ 00:22:45.789 { 00:22:45.789 "name": null, 00:22:45.789 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.789 "is_configured": false, 00:22:45.789 "data_offset": 256, 00:22:45.789 "data_size": 7936 00:22:45.789 }, 00:22:45.789 { 00:22:45.789 
"name": "BaseBdev2", 00:22:45.789 "uuid": "423f92f3-eb90-47ec-8050-8267a5dacac7", 00:22:45.789 "is_configured": true, 00:22:45.789 "data_offset": 256, 00:22:45.789 "data_size": 7936 00:22:45.789 } 00:22:45.789 ] 00:22:45.789 }' 00:22:45.789 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:45.789 04:23:33 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:46.354 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:22:46.354 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:22:46.354 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.354 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:22:46.612 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:22:46.612 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:46.612 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:46.870 [2024-05-15 04:23:34.669225] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:46.870 [2024-05-15 04:23:34.669337] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:46.870 [2024-05-15 04:23:34.683718] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:46.870 [2024-05-15 04:23:34.683788] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:46.870 [2024-05-15 04:23:34.683801] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x169c990 name Existed_Raid, state offline 00:22:46.870 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:22:46.870 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:22:46.870 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.870 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@342 -- # killprocess 3942198 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@946 -- # '[' -z 3942198 ']' 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@950 -- # kill -0 3942198 00:22:47.127 04:23:34 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@951 -- # uname 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3942198 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3942198' 00:22:47.127 killing process with pid 3942198 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@965 -- # kill 3942198 00:22:47.127 04:23:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@970 -- # wait 3942198 00:22:47.127 [2024-05-15 04:23:34.968257] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:47.127 [2024-05-15 04:23:34.969293] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:47.384 04:23:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@344 -- # return 0 00:22:47.384 00:22:47.384 real 0m10.272s 00:22:47.384 user 0m18.593s 00:22:47.384 sys 0m1.460s 00:22:47.384 04:23:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:22:47.384 04:23:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:47.384 ************************************ 00:22:47.384 END TEST raid_state_function_test_sb_md_separate 00:22:47.384 ************************************ 00:22:47.384 04:23:35 bdev_raid -- bdev/bdev_raid.sh@840 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:22:47.384 04:23:35 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:22:47.384 04:23:35 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:22:47.384 04:23:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:47.384 ************************************ 00:22:47.384 START TEST raid_superblock_test_md_separate 00:22:47.384 ************************************ 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:22:47.384 04:23:35 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:22:47.384 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # raid_pid=3943623 00:22:47.385 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:47.385 04:23:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@413 -- # waitforlisten 3943623 /var/tmp/spdk-raid.sock 00:22:47.385 04:23:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@827 -- # '[' -z 3943623 ']' 00:22:47.385 04:23:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:47.385 04:23:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@832 -- # local max_retries=100 00:22:47.385 04:23:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:47.385 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:47.385 04:23:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # xtrace_disable 00:22:47.385 04:23:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:47.385 [2024-05-15 04:23:35.335720] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:22:47.385 [2024-05-15 04:23:35.335802] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3943623 ] 00:22:47.641 [2024-05-15 04:23:35.418207] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:47.641 [2024-05-15 04:23:35.533491] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:22:47.641 [2024-05-15 04:23:35.610254] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:47.641 [2024-05-15 04:23:35.610308] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@860 -- # return 0 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:22:48.573 malloc1 00:22:48.573 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:48.829 [2024-05-15 04:23:36.723034] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:48.829 [2024-05-15 04:23:36.723104] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:48.829 [2024-05-15 04:23:36.723132] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xda6770 00:22:48.829 [2024-05-15 04:23:36.723149] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:48.829 [2024-05-15 04:23:36.724473] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:48.829 [2024-05-15 04:23:36.724502] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:48.829 pt1 00:22:48.829 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:22:48.829 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:22:48.829 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:22:48.829 04:23:36 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:22:48.829 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:48.829 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:48.829 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:22:48.829 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:48.829 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:22:49.086 malloc2 00:22:49.086 04:23:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:49.344 [2024-05-15 04:23:37.268488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:49.344 [2024-05-15 04:23:37.268560] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:49.344 [2024-05-15 04:23:37.268592] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xefd4a0 00:22:49.344 [2024-05-15 04:23:37.268617] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:49.344 [2024-05-15 04:23:37.270328] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:49.344 [2024-05-15 04:23:37.270358] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:49.344 pt2 00:22:49.344 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:22:49.344 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:22:49.344 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:22:49.601 [2024-05-15 04:23:37.513144] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:49.601 [2024-05-15 04:23:37.514381] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:49.601 [2024-05-15 04:23:37.514564] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xefee00 00:22:49.601 [2024-05-15 04:23:37.514583] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:49.601 [2024-05-15 04:23:37.514663] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf40860 00:22:49.601 [2024-05-15 04:23:37.514801] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xefee00 00:22:49.601 [2024-05-15 04:23:37.514820] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xefee00 00:22:49.601 [2024-05-15 04:23:37.514927] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.601 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.859 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:49.859 "name": "raid_bdev1", 00:22:49.859 "uuid": "bc205998-8555-46d8-a499-21f348c367e2", 00:22:49.859 "strip_size_kb": 0, 00:22:49.859 "state": "online", 00:22:49.859 "raid_level": "raid1", 00:22:49.859 "superblock": true, 00:22:49.859 "num_base_bdevs": 2, 00:22:49.859 "num_base_bdevs_discovered": 2, 00:22:49.859 "num_base_bdevs_operational": 2, 00:22:49.859 "base_bdevs_list": [ 00:22:49.859 { 00:22:49.859 "name": "pt1", 00:22:49.859 "uuid": "4035852f-6c8a-58eb-9cff-6b28e3a3cf90", 00:22:49.859 "is_configured": true, 00:22:49.859 "data_offset": 256, 00:22:49.859 "data_size": 7936 00:22:49.859 }, 00:22:49.859 { 00:22:49.859 "name": "pt2", 00:22:49.859 "uuid": "7ea06c8e-c070-5d55-8b54-ee6134f6ec09", 00:22:49.859 "is_configured": true, 00:22:49.859 "data_offset": 256, 00:22:49.859 "data_size": 7936 00:22:49.859 } 00:22:49.859 ] 00:22:49.859 }' 00:22:49.859 04:23:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:49.859 04:23:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:50.424 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:22:50.424 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:22:50.424 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:22:50.424 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:22:50.424 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:22:50.424 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@199 -- # local name 00:22:50.424 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:50.424 04:23:38 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@201 -- # jq '.[]' 00:22:50.682 [2024-05-15 04:23:38.540123] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:50.682 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:22:50.682 "name": "raid_bdev1", 00:22:50.682 "aliases": [ 00:22:50.682 "bc205998-8555-46d8-a499-21f348c367e2" 00:22:50.682 ], 00:22:50.682 "product_name": "Raid Volume", 00:22:50.682 "block_size": 4096, 00:22:50.682 "num_blocks": 7936, 00:22:50.682 "uuid": "bc205998-8555-46d8-a499-21f348c367e2", 00:22:50.682 "md_size": 32, 00:22:50.682 "md_interleave": false, 00:22:50.682 "dif_type": 0, 00:22:50.682 "assigned_rate_limits": { 00:22:50.682 "rw_ios_per_sec": 0, 00:22:50.682 "rw_mbytes_per_sec": 0, 00:22:50.682 "r_mbytes_per_sec": 0, 00:22:50.682 "w_mbytes_per_sec": 0 00:22:50.682 }, 00:22:50.682 "claimed": false, 00:22:50.682 "zoned": false, 00:22:50.682 "supported_io_types": { 00:22:50.682 "read": true, 00:22:50.682 "write": true, 00:22:50.682 "unmap": false, 00:22:50.682 "write_zeroes": true, 00:22:50.682 "flush": false, 00:22:50.682 "reset": true, 00:22:50.682 "compare": false, 00:22:50.682 "compare_and_write": false, 00:22:50.682 "abort": false, 00:22:50.682 "nvme_admin": false, 00:22:50.682 "nvme_io": false 00:22:50.682 }, 00:22:50.682 "memory_domains": [ 00:22:50.682 { 00:22:50.682 "dma_device_id": "system", 00:22:50.682 "dma_device_type": 1 00:22:50.682 }, 00:22:50.682 { 00:22:50.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.682 "dma_device_type": 2 00:22:50.682 }, 00:22:50.682 { 00:22:50.682 "dma_device_id": "system", 00:22:50.682 "dma_device_type": 1 00:22:50.682 }, 00:22:50.682 { 00:22:50.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.682 "dma_device_type": 2 00:22:50.682 } 00:22:50.682 ], 00:22:50.682 "driver_specific": { 00:22:50.682 "raid": { 00:22:50.682 "uuid": "bc205998-8555-46d8-a499-21f348c367e2", 00:22:50.682 "strip_size_kb": 0, 00:22:50.682 "state": "online", 00:22:50.682 "raid_level": "raid1", 00:22:50.682 "superblock": true, 00:22:50.682 "num_base_bdevs": 2, 00:22:50.682 "num_base_bdevs_discovered": 2, 00:22:50.682 "num_base_bdevs_operational": 2, 00:22:50.682 "base_bdevs_list": [ 00:22:50.682 { 00:22:50.682 "name": "pt1", 00:22:50.682 "uuid": "4035852f-6c8a-58eb-9cff-6b28e3a3cf90", 00:22:50.682 "is_configured": true, 00:22:50.682 "data_offset": 256, 00:22:50.682 "data_size": 7936 00:22:50.682 }, 00:22:50.682 { 00:22:50.682 "name": "pt2", 00:22:50.682 "uuid": "7ea06c8e-c070-5d55-8b54-ee6134f6ec09", 00:22:50.682 "is_configured": true, 00:22:50.682 "data_offset": 256, 00:22:50.682 "data_size": 7936 00:22:50.682 } 00:22:50.682 ] 00:22:50.682 } 00:22:50.682 } 00:22:50.682 }' 00:22:50.682 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:50.682 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:22:50.682 pt2' 00:22:50.682 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:22:50.682 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:50.682 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:22:50.940 04:23:38 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:22:50.940 "name": "pt1", 00:22:50.940 "aliases": [ 00:22:50.940 "4035852f-6c8a-58eb-9cff-6b28e3a3cf90" 00:22:50.940 ], 00:22:50.940 "product_name": "passthru", 00:22:50.940 "block_size": 4096, 00:22:50.940 "num_blocks": 8192, 00:22:50.940 "uuid": "4035852f-6c8a-58eb-9cff-6b28e3a3cf90", 00:22:50.940 "md_size": 32, 00:22:50.940 "md_interleave": false, 00:22:50.940 "dif_type": 0, 00:22:50.940 "assigned_rate_limits": { 00:22:50.940 "rw_ios_per_sec": 0, 00:22:50.940 "rw_mbytes_per_sec": 0, 00:22:50.940 "r_mbytes_per_sec": 0, 00:22:50.940 "w_mbytes_per_sec": 0 00:22:50.940 }, 00:22:50.940 "claimed": true, 00:22:50.940 "claim_type": "exclusive_write", 00:22:50.940 "zoned": false, 00:22:50.940 "supported_io_types": { 00:22:50.940 "read": true, 00:22:50.940 "write": true, 00:22:50.940 "unmap": true, 00:22:50.940 "write_zeroes": true, 00:22:50.940 "flush": true, 00:22:50.940 "reset": true, 00:22:50.940 "compare": false, 00:22:50.940 "compare_and_write": false, 00:22:50.940 "abort": true, 00:22:50.940 "nvme_admin": false, 00:22:50.940 "nvme_io": false 00:22:50.940 }, 00:22:50.940 "memory_domains": [ 00:22:50.940 { 00:22:50.940 "dma_device_id": "system", 00:22:50.940 "dma_device_type": 1 00:22:50.940 }, 00:22:50.940 { 00:22:50.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.940 "dma_device_type": 2 00:22:50.940 } 00:22:50.940 ], 00:22:50.940 "driver_specific": { 00:22:50.940 "passthru": { 00:22:50.940 "name": "pt1", 00:22:50.940 "base_bdev_name": "malloc1" 00:22:50.940 } 00:22:50.940 } 00:22:50.940 }' 00:22:50.940 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:50.940 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:50.940 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:22:50.940 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:50.940 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:51.197 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:22:51.197 04:23:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:51.197 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:51.197 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:22:51.197 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:51.197 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:51.197 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:22:51.197 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:22:51.197 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:51.197 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:22:51.456 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:22:51.456 "name": "pt2", 00:22:51.456 "aliases": [ 00:22:51.456 "7ea06c8e-c070-5d55-8b54-ee6134f6ec09" 
00:22:51.456 ], 00:22:51.456 "product_name": "passthru", 00:22:51.456 "block_size": 4096, 00:22:51.456 "num_blocks": 8192, 00:22:51.456 "uuid": "7ea06c8e-c070-5d55-8b54-ee6134f6ec09", 00:22:51.456 "md_size": 32, 00:22:51.456 "md_interleave": false, 00:22:51.456 "dif_type": 0, 00:22:51.456 "assigned_rate_limits": { 00:22:51.456 "rw_ios_per_sec": 0, 00:22:51.456 "rw_mbytes_per_sec": 0, 00:22:51.456 "r_mbytes_per_sec": 0, 00:22:51.456 "w_mbytes_per_sec": 0 00:22:51.456 }, 00:22:51.456 "claimed": true, 00:22:51.456 "claim_type": "exclusive_write", 00:22:51.456 "zoned": false, 00:22:51.456 "supported_io_types": { 00:22:51.456 "read": true, 00:22:51.456 "write": true, 00:22:51.456 "unmap": true, 00:22:51.456 "write_zeroes": true, 00:22:51.456 "flush": true, 00:22:51.456 "reset": true, 00:22:51.456 "compare": false, 00:22:51.456 "compare_and_write": false, 00:22:51.456 "abort": true, 00:22:51.456 "nvme_admin": false, 00:22:51.456 "nvme_io": false 00:22:51.456 }, 00:22:51.456 "memory_domains": [ 00:22:51.456 { 00:22:51.456 "dma_device_id": "system", 00:22:51.456 "dma_device_type": 1 00:22:51.456 }, 00:22:51.456 { 00:22:51.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.456 "dma_device_type": 2 00:22:51.456 } 00:22:51.456 ], 00:22:51.456 "driver_specific": { 00:22:51.456 "passthru": { 00:22:51.456 "name": "pt2", 00:22:51.456 "base_bdev_name": "malloc2" 00:22:51.456 } 00:22:51.456 } 00:22:51.456 }' 00:22:51.456 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:51.456 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:51.456 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:22:51.456 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:51.714 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:51.714 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:22:51.714 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:51.714 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:51.714 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:22:51.714 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:51.714 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:51.714 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:22:51.714 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:51.714 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:22:51.972 [2024-05-15 04:23:39.887737] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:51.972 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=bc205998-8555-46d8-a499-21f348c367e2 00:22:51.972 04:23:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # '[' -z bc205998-8555-46d8-a499-21f348c367e2 ']' 00:22:51.972 04:23:39 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:52.229 [2024-05-15 04:23:40.140189] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:52.229 [2024-05-15 04:23:40.140223] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:52.229 [2024-05-15 04:23:40.140313] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:52.229 [2024-05-15 04:23:40.140384] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:52.229 [2024-05-15 04:23:40.140400] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefee00 name raid_bdev1, state offline 00:22:52.229 04:23:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.229 04:23:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:22:52.487 04:23:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:22:52.487 04:23:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:22:52.487 04:23:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:22:52.487 04:23:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:52.744 04:23:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:22:52.744 04:23:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:53.001 04:23:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:53.001 04:23:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:53.259 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:22:53.517 [2024-05-15 04:23:41.483713] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:53.517 [2024-05-15 04:23:41.485140] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:53.517 [2024-05-15 04:23:41.485210] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:53.517 [2024-05-15 04:23:41.485277] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:53.517 [2024-05-15 04:23:41.485303] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:53.517 [2024-05-15 04:23:41.485317] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefe2d0 name raid_bdev1, state configuring 00:22:53.517 request: 00:22:53.517 { 00:22:53.517 "name": "raid_bdev1", 00:22:53.517 "raid_level": "raid1", 00:22:53.517 "base_bdevs": [ 00:22:53.517 "malloc1", 00:22:53.517 "malloc2" 00:22:53.517 ], 00:22:53.517 "superblock": false, 00:22:53.517 "method": "bdev_raid_create", 00:22:53.517 "req_id": 1 00:22:53.517 } 00:22:53.517 Got JSON-RPC error response 00:22:53.517 response: 00:22:53.517 { 00:22:53.517 "code": -17, 00:22:53.517 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:53.517 } 00:22:53.517 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:22:53.517 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:53.517 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:53.517 04:23:41 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:53.517 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.517 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:22:53.775 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:22:53.775 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:22:53.775 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@465 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:54.033 [2024-05-15 04:23:41.968932] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:54.033 [2024-05-15 04:23:41.969009] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:54.033 [2024-05-15 04:23:41.969038] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xda6e60 00:22:54.033 [2024-05-15 04:23:41.969054] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:54.033 [2024-05-15 04:23:41.970613] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:54.033 [2024-05-15 04:23:41.970649] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:54.033 [2024-05-15 04:23:41.970717] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:54.033 [2024-05-15 04:23:41.970757] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:54.033 pt1 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.033 04:23:41 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.290 04:23:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:54.290 "name": "raid_bdev1", 00:22:54.290 "uuid": "bc205998-8555-46d8-a499-21f348c367e2", 00:22:54.290 "strip_size_kb": 0, 00:22:54.290 "state": "configuring", 00:22:54.290 "raid_level": "raid1", 00:22:54.290 "superblock": true, 00:22:54.290 "num_base_bdevs": 2, 00:22:54.290 "num_base_bdevs_discovered": 1, 00:22:54.290 "num_base_bdevs_operational": 2, 00:22:54.290 "base_bdevs_list": [ 00:22:54.290 { 00:22:54.290 "name": "pt1", 00:22:54.290 "uuid": "4035852f-6c8a-58eb-9cff-6b28e3a3cf90", 00:22:54.290 "is_configured": true, 00:22:54.290 "data_offset": 256, 00:22:54.290 "data_size": 7936 00:22:54.290 }, 00:22:54.290 { 00:22:54.290 "name": null, 00:22:54.290 "uuid": 
"7ea06c8e-c070-5d55-8b54-ee6134f6ec09", 00:22:54.290 "is_configured": false, 00:22:54.290 "data_offset": 256, 00:22:54.290 "data_size": 7936 00:22:54.290 } 00:22:54.290 ] 00:22:54.290 }' 00:22:54.290 04:23:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:54.290 04:23:42 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:54.854 04:23:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:22:54.854 04:23:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:22:54.854 04:23:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:22:54.854 04:23:42 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:55.112 [2024-05-15 04:23:42.995882] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:55.112 [2024-05-15 04:23:42.995959] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:55.112 [2024-05-15 04:23:42.995989] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf40800 00:22:55.112 [2024-05-15 04:23:42.996005] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:55.112 [2024-05-15 04:23:42.996258] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:55.112 [2024-05-15 04:23:42.996283] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:55.112 [2024-05-15 04:23:42.996343] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:55.112 [2024-05-15 04:23:42.996372] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:55.112 [2024-05-15 04:23:42.996496] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xeff730 00:22:55.112 [2024-05-15 04:23:42.996512] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:55.112 [2024-05-15 04:23:42.996576] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf011a0 00:22:55.112 [2024-05-15 04:23:42.996703] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xeff730 00:22:55.112 [2024-05-15 04:23:42.996719] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xeff730 00:22:55.112 [2024-05-15 04:23:42.996805] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:55.112 pt2 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:55.112 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:55.113 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.113 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:55.370 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:55.370 "name": "raid_bdev1", 00:22:55.370 "uuid": "bc205998-8555-46d8-a499-21f348c367e2", 00:22:55.370 "strip_size_kb": 0, 00:22:55.370 "state": "online", 00:22:55.370 "raid_level": "raid1", 00:22:55.370 "superblock": true, 00:22:55.370 "num_base_bdevs": 2, 00:22:55.370 "num_base_bdevs_discovered": 2, 00:22:55.370 "num_base_bdevs_operational": 2, 00:22:55.370 "base_bdevs_list": [ 00:22:55.370 { 00:22:55.370 "name": "pt1", 00:22:55.370 "uuid": "4035852f-6c8a-58eb-9cff-6b28e3a3cf90", 00:22:55.370 "is_configured": true, 00:22:55.370 "data_offset": 256, 00:22:55.370 "data_size": 7936 00:22:55.370 }, 00:22:55.370 { 00:22:55.370 "name": "pt2", 00:22:55.370 "uuid": "7ea06c8e-c070-5d55-8b54-ee6134f6ec09", 00:22:55.370 "is_configured": true, 00:22:55.370 "data_offset": 256, 00:22:55.370 "data_size": 7936 00:22:55.370 } 00:22:55.370 ] 00:22:55.370 }' 00:22:55.370 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:55.370 04:23:43 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:55.946 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:22:55.946 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:22:55.946 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:22:55.946 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:22:55.946 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:22:55.946 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@199 -- # local name 00:22:55.946 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:55.946 04:23:43 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:22:56.204 [2024-05-15 04:23:44.022850] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:56.204 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:22:56.204 "name": "raid_bdev1", 00:22:56.204 "aliases": [ 00:22:56.204 
"bc205998-8555-46d8-a499-21f348c367e2" 00:22:56.204 ], 00:22:56.204 "product_name": "Raid Volume", 00:22:56.204 "block_size": 4096, 00:22:56.204 "num_blocks": 7936, 00:22:56.204 "uuid": "bc205998-8555-46d8-a499-21f348c367e2", 00:22:56.204 "md_size": 32, 00:22:56.204 "md_interleave": false, 00:22:56.204 "dif_type": 0, 00:22:56.204 "assigned_rate_limits": { 00:22:56.204 "rw_ios_per_sec": 0, 00:22:56.204 "rw_mbytes_per_sec": 0, 00:22:56.204 "r_mbytes_per_sec": 0, 00:22:56.204 "w_mbytes_per_sec": 0 00:22:56.204 }, 00:22:56.204 "claimed": false, 00:22:56.204 "zoned": false, 00:22:56.204 "supported_io_types": { 00:22:56.204 "read": true, 00:22:56.204 "write": true, 00:22:56.204 "unmap": false, 00:22:56.204 "write_zeroes": true, 00:22:56.204 "flush": false, 00:22:56.204 "reset": true, 00:22:56.204 "compare": false, 00:22:56.204 "compare_and_write": false, 00:22:56.204 "abort": false, 00:22:56.204 "nvme_admin": false, 00:22:56.204 "nvme_io": false 00:22:56.204 }, 00:22:56.204 "memory_domains": [ 00:22:56.204 { 00:22:56.204 "dma_device_id": "system", 00:22:56.204 "dma_device_type": 1 00:22:56.204 }, 00:22:56.204 { 00:22:56.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:56.204 "dma_device_type": 2 00:22:56.204 }, 00:22:56.204 { 00:22:56.204 "dma_device_id": "system", 00:22:56.204 "dma_device_type": 1 00:22:56.204 }, 00:22:56.204 { 00:22:56.204 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:56.204 "dma_device_type": 2 00:22:56.204 } 00:22:56.204 ], 00:22:56.204 "driver_specific": { 00:22:56.204 "raid": { 00:22:56.204 "uuid": "bc205998-8555-46d8-a499-21f348c367e2", 00:22:56.204 "strip_size_kb": 0, 00:22:56.204 "state": "online", 00:22:56.204 "raid_level": "raid1", 00:22:56.204 "superblock": true, 00:22:56.204 "num_base_bdevs": 2, 00:22:56.204 "num_base_bdevs_discovered": 2, 00:22:56.204 "num_base_bdevs_operational": 2, 00:22:56.204 "base_bdevs_list": [ 00:22:56.204 { 00:22:56.204 "name": "pt1", 00:22:56.204 "uuid": "4035852f-6c8a-58eb-9cff-6b28e3a3cf90", 00:22:56.204 "is_configured": true, 00:22:56.204 "data_offset": 256, 00:22:56.204 "data_size": 7936 00:22:56.204 }, 00:22:56.204 { 00:22:56.204 "name": "pt2", 00:22:56.204 "uuid": "7ea06c8e-c070-5d55-8b54-ee6134f6ec09", 00:22:56.204 "is_configured": true, 00:22:56.204 "data_offset": 256, 00:22:56.204 "data_size": 7936 00:22:56.204 } 00:22:56.204 ] 00:22:56.204 } 00:22:56.204 } 00:22:56.204 }' 00:22:56.204 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:56.204 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:22:56.204 pt2' 00:22:56.204 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:22:56.204 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:56.204 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:22:56.461 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:22:56.461 "name": "pt1", 00:22:56.461 "aliases": [ 00:22:56.461 "4035852f-6c8a-58eb-9cff-6b28e3a3cf90" 00:22:56.461 ], 00:22:56.461 "product_name": "passthru", 00:22:56.461 "block_size": 4096, 00:22:56.461 "num_blocks": 8192, 00:22:56.461 "uuid": "4035852f-6c8a-58eb-9cff-6b28e3a3cf90", 00:22:56.461 
"md_size": 32, 00:22:56.461 "md_interleave": false, 00:22:56.461 "dif_type": 0, 00:22:56.461 "assigned_rate_limits": { 00:22:56.461 "rw_ios_per_sec": 0, 00:22:56.461 "rw_mbytes_per_sec": 0, 00:22:56.461 "r_mbytes_per_sec": 0, 00:22:56.461 "w_mbytes_per_sec": 0 00:22:56.461 }, 00:22:56.461 "claimed": true, 00:22:56.461 "claim_type": "exclusive_write", 00:22:56.461 "zoned": false, 00:22:56.461 "supported_io_types": { 00:22:56.461 "read": true, 00:22:56.461 "write": true, 00:22:56.461 "unmap": true, 00:22:56.461 "write_zeroes": true, 00:22:56.461 "flush": true, 00:22:56.461 "reset": true, 00:22:56.461 "compare": false, 00:22:56.461 "compare_and_write": false, 00:22:56.461 "abort": true, 00:22:56.461 "nvme_admin": false, 00:22:56.461 "nvme_io": false 00:22:56.461 }, 00:22:56.461 "memory_domains": [ 00:22:56.461 { 00:22:56.461 "dma_device_id": "system", 00:22:56.461 "dma_device_type": 1 00:22:56.461 }, 00:22:56.461 { 00:22:56.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:56.461 "dma_device_type": 2 00:22:56.461 } 00:22:56.461 ], 00:22:56.461 "driver_specific": { 00:22:56.461 "passthru": { 00:22:56.461 "name": "pt1", 00:22:56.461 "base_bdev_name": "malloc1" 00:22:56.461 } 00:22:56.461 } 00:22:56.461 }' 00:22:56.461 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:56.461 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:56.461 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:22:56.461 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:56.461 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:56.461 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:22:56.461 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:56.718 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:56.718 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:22:56.718 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:56.718 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:56.718 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:22:56.718 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:22:56.718 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:56.718 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:22:56.976 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:22:56.976 "name": "pt2", 00:22:56.976 "aliases": [ 00:22:56.976 "7ea06c8e-c070-5d55-8b54-ee6134f6ec09" 00:22:56.976 ], 00:22:56.976 "product_name": "passthru", 00:22:56.976 "block_size": 4096, 00:22:56.976 "num_blocks": 8192, 00:22:56.976 "uuid": "7ea06c8e-c070-5d55-8b54-ee6134f6ec09", 00:22:56.976 "md_size": 32, 00:22:56.976 "md_interleave": false, 00:22:56.976 "dif_type": 0, 00:22:56.976 "assigned_rate_limits": { 00:22:56.976 "rw_ios_per_sec": 0, 
00:22:56.976 "rw_mbytes_per_sec": 0, 00:22:56.976 "r_mbytes_per_sec": 0, 00:22:56.976 "w_mbytes_per_sec": 0 00:22:56.976 }, 00:22:56.976 "claimed": true, 00:22:56.976 "claim_type": "exclusive_write", 00:22:56.976 "zoned": false, 00:22:56.976 "supported_io_types": { 00:22:56.976 "read": true, 00:22:56.976 "write": true, 00:22:56.976 "unmap": true, 00:22:56.976 "write_zeroes": true, 00:22:56.976 "flush": true, 00:22:56.976 "reset": true, 00:22:56.976 "compare": false, 00:22:56.976 "compare_and_write": false, 00:22:56.976 "abort": true, 00:22:56.976 "nvme_admin": false, 00:22:56.976 "nvme_io": false 00:22:56.976 }, 00:22:56.976 "memory_domains": [ 00:22:56.976 { 00:22:56.976 "dma_device_id": "system", 00:22:56.976 "dma_device_type": 1 00:22:56.976 }, 00:22:56.976 { 00:22:56.976 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:56.976 "dma_device_type": 2 00:22:56.976 } 00:22:56.976 ], 00:22:56.976 "driver_specific": { 00:22:56.976 "passthru": { 00:22:56.976 "name": "pt2", 00:22:56.976 "base_bdev_name": "malloc2" 00:22:56.976 } 00:22:56.976 } 00:22:56.976 }' 00:22:56.976 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:56.976 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:22:56.976 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 4096 == 4096 ]] 00:22:56.976 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:56.976 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:22:57.234 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:22:57.234 04:23:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:57.234 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:22:57.234 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ false == false ]] 00:22:57.234 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:57.234 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:22:57.234 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:22:57.234 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:57.234 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:22:57.492 [2024-05-15 04:23:45.374464] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:57.492 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@487 -- # '[' bc205998-8555-46d8-a499-21f348c367e2 '!=' bc205998-8555-46d8-a499-21f348c367e2 ']' 00:22:57.492 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:22:57.492 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # case $1 in 00:22:57.492 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@215 -- # return 0 00:22:57.492 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:22:57.750 [2024-05-15 04:23:45.618947] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.750 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.008 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:58.008 "name": "raid_bdev1", 00:22:58.008 "uuid": "bc205998-8555-46d8-a499-21f348c367e2", 00:22:58.008 "strip_size_kb": 0, 00:22:58.008 "state": "online", 00:22:58.008 "raid_level": "raid1", 00:22:58.008 "superblock": true, 00:22:58.008 "num_base_bdevs": 2, 00:22:58.008 "num_base_bdevs_discovered": 1, 00:22:58.008 "num_base_bdevs_operational": 1, 00:22:58.008 "base_bdevs_list": [ 00:22:58.008 { 00:22:58.008 "name": null, 00:22:58.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.008 "is_configured": false, 00:22:58.008 "data_offset": 256, 00:22:58.008 "data_size": 7936 00:22:58.008 }, 00:22:58.008 { 00:22:58.008 "name": "pt2", 00:22:58.008 "uuid": "7ea06c8e-c070-5d55-8b54-ee6134f6ec09", 00:22:58.008 "is_configured": true, 00:22:58.008 "data_offset": 256, 00:22:58.008 "data_size": 7936 00:22:58.008 } 00:22:58.008 ] 00:22:58.008 }' 00:22:58.008 04:23:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:58.008 04:23:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:22:58.573 04:23:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:58.831 [2024-05-15 04:23:46.689732] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:58.831 [2024-05-15 04:23:46.689769] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:58.831 [2024-05-15 04:23:46.689872] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:58.831 [2024-05-15 04:23:46.689943] bdev_raid.c: 
430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:58.831 [2024-05-15 04:23:46.689960] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeff730 name raid_bdev1, state offline 00:22:58.831 04:23:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.831 04:23:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:22:59.088 04:23:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:22:59.088 04:23:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:22:59.088 04:23:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:22:59.088 04:23:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:22:59.088 04:23:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:59.345 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i++ )) 00:22:59.345 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:22:59.345 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:22:59.345 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:22:59.345 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # i=1 00:22:59.345 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:59.602 [2024-05-15 04:23:47.467759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:59.602 [2024-05-15 04:23:47.467835] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:59.602 [2024-05-15 04:23:47.467866] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xda6160 00:22:59.602 [2024-05-15 04:23:47.467883] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:59.602 [2024-05-15 04:23:47.469449] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:59.603 [2024-05-15 04:23:47.469478] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:59.603 [2024-05-15 04:23:47.469541] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:59.603 [2024-05-15 04:23:47.469583] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:59.603 [2024-05-15 04:23:47.469704] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xf00c50 00:22:59.603 [2024-05-15 04:23:47.469721] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:22:59.603 [2024-05-15 04:23:47.469780] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf00580 00:22:59.603 [2024-05-15 04:23:47.469927] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf00c50 00:22:59.603 [2024-05-15 04:23:47.469944] 
bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf00c50 00:22:59.603 [2024-05-15 04:23:47.470032] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:59.603 pt2 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.603 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.916 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:22:59.916 "name": "raid_bdev1", 00:22:59.916 "uuid": "bc205998-8555-46d8-a499-21f348c367e2", 00:22:59.916 "strip_size_kb": 0, 00:22:59.916 "state": "online", 00:22:59.916 "raid_level": "raid1", 00:22:59.916 "superblock": true, 00:22:59.916 "num_base_bdevs": 2, 00:22:59.916 "num_base_bdevs_discovered": 1, 00:22:59.916 "num_base_bdevs_operational": 1, 00:22:59.916 "base_bdevs_list": [ 00:22:59.916 { 00:22:59.916 "name": null, 00:22:59.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.916 "is_configured": false, 00:22:59.916 "data_offset": 256, 00:22:59.916 "data_size": 7936 00:22:59.916 }, 00:22:59.916 { 00:22:59.916 "name": "pt2", 00:22:59.916 "uuid": "7ea06c8e-c070-5d55-8b54-ee6134f6ec09", 00:22:59.916 "is_configured": true, 00:22:59.916 "data_offset": 256, 00:22:59.916 "data_size": 7936 00:22:59.916 } 00:22:59.916 ] 00:22:59.916 }' 00:22:59.916 04:23:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:22:59.916 04:23:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:00.499 04:23:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:00.499 [2024-05-15 04:23:48.502477] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:00.499 [2024-05-15 04:23:48.502507] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:00.499 [2024-05-15 04:23:48.502584] bdev_raid.c: 453:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:23:00.499 [2024-05-15 04:23:48.502647] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:00.499 [2024-05-15 04:23:48.502663] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf00c50 name raid_bdev1, state offline 00:23:00.757 04:23:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.757 04:23:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # jq -r '.[]' 00:23:00.757 04:23:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # raid_bdev= 00:23:00.757 04:23:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@528 -- # '[' -n '' ']' 00:23:00.757 04:23:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@532 -- # '[' 2 -gt 2 ']' 00:23:00.757 04:23:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:01.014 [2024-05-15 04:23:48.987736] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:01.014 [2024-05-15 04:23:48.987807] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:01.014 [2024-05-15 04:23:48.987842] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf40bd0 00:23:01.014 [2024-05-15 04:23:48.987860] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:01.014 [2024-05-15 04:23:48.989428] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:01.014 [2024-05-15 04:23:48.989456] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:01.014 [2024-05-15 04:23:48.989527] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:01.014 [2024-05-15 04:23:48.989567] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:01.014 [2024-05-15 04:23:48.989683] bdev_raid.c:3487:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:01.014 [2024-05-15 04:23:48.989702] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:01.014 [2024-05-15 04:23:48.989729] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf00ed0 name raid_bdev1, state configuring 00:23:01.014 [2024-05-15 04:23:48.989760] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:01.014 [2024-05-15 04:23:48.989846] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0xf00ed0 00:23:01.014 [2024-05-15 04:23:48.989862] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:01.014 [2024-05-15 04:23:48.989921] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf01d00 00:23:01.014 [2024-05-15 04:23:48.990044] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf00ed0 00:23:01.014 [2024-05-15 04:23:48.990060] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf00ed0 00:23:01.014 [2024-05-15 04:23:48.990141] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:01.014 pt1 00:23:01.014 
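The step above is where the md-separate superblock logic is actually exercised: re-creating pt1 on top of malloc1 triggers examine, which finds a raid superblock with sequence number 2 on pt1 while pt2 still carries sequence number 4. The raid module keeps the newer metadata, so the partially configured raid_bdev1 built from pt1 is deleted and the array is re-assembled around pt2 alone, coming online with only one of its two base bdevs configured (verified in the state check just below). A minimal standalone sketch of the same sequence, assuming the SPDK target from this test is still listening on /var/tmp/spdk-raid.sock and using illustrative variable names rather than the test's own helpers:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Re-creating the passthru bdev triggers examine; the stale superblock (seq 2) on pt1
# loses to the newer one (seq 4) on pt2, so raid_bdev1 is re-assembled around pt2 only.
$rpc -s $sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'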
04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # '[' 2 -gt 2 ']' 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.014 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.272 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:01.272 "name": "raid_bdev1", 00:23:01.272 "uuid": "bc205998-8555-46d8-a499-21f348c367e2", 00:23:01.272 "strip_size_kb": 0, 00:23:01.272 "state": "online", 00:23:01.272 "raid_level": "raid1", 00:23:01.272 "superblock": true, 00:23:01.272 "num_base_bdevs": 2, 00:23:01.272 "num_base_bdevs_discovered": 1, 00:23:01.272 "num_base_bdevs_operational": 1, 00:23:01.272 "base_bdevs_list": [ 00:23:01.272 { 00:23:01.272 "name": null, 00:23:01.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.272 "is_configured": false, 00:23:01.272 "data_offset": 256, 00:23:01.272 "data_size": 7936 00:23:01.272 }, 00:23:01.272 { 00:23:01.272 "name": "pt2", 00:23:01.272 "uuid": "7ea06c8e-c070-5d55-8b54-ee6134f6ec09", 00:23:01.272 "is_configured": true, 00:23:01.272 "data_offset": 256, 00:23:01.272 "data_size": 7936 00:23:01.272 } 00:23:01.272 ] 00:23:01.272 }' 00:23:01.272 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:01.272 04:23:49 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:01.837 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:01.837 04:23:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@555 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:02.093 04:23:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@555 -- # [[ false == \f\a\l\s\e ]] 00:23:02.093 04:23:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@558 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
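With raid_bdev1 re-assembled purely from pt2's superblock, the final identity check reads the volume's uuid back through bdev_get_bdevs and compares it (immediately below, at bdev_raid.sh@558) against the uuid the array was given when it was first created, confirming that md-separate superblock re-assembly preserves bc205998-8555-46d8-a499-21f348c367e2. A minimal sketch of that check, with illustrative variable names and the same rpc.py/jq invocations as the trace:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
expected=bc205998-8555-46d8-a499-21f348c367e2   # uuid captured when raid_bdev1 was first created
actual=$($rpc -s $sock bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid')
# Fail loudly if re-assembly from the on-disk superblock changed the volume identity.
[ "$actual" = "$expected" ] || { echo "raid_bdev1 uuid changed after re-assembly" >&2; exit 1; }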
00:23:02.093 04:23:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@558 -- # jq -r '.[] | .uuid' 00:23:02.350 [2024-05-15 04:23:50.295437] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:02.350 04:23:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@558 -- # '[' bc205998-8555-46d8-a499-21f348c367e2 '!=' bc205998-8555-46d8-a499-21f348c367e2 ']' 00:23:02.350 04:23:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@563 -- # killprocess 3943623 00:23:02.350 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@946 -- # '[' -z 3943623 ']' 00:23:02.350 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@950 -- # kill -0 3943623 00:23:02.350 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@951 -- # uname 00:23:02.350 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:02.350 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3943623 00:23:02.350 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:02.351 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:02.351 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3943623' 00:23:02.351 killing process with pid 3943623 00:23:02.351 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@965 -- # kill 3943623 00:23:02.351 [2024-05-15 04:23:50.335486] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:02.351 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@970 -- # wait 3943623 00:23:02.351 [2024-05-15 04:23:50.335584] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:02.351 [2024-05-15 04:23:50.335651] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:02.351 [2024-05-15 04:23:50.335668] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf00ed0 name raid_bdev1, state offline 00:23:02.351 [2024-05-15 04:23:50.364953] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:02.916 04:23:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@565 -- # return 0 00:23:02.916 00:23:02.916 real 0m15.361s 00:23:02.916 user 0m28.274s 00:23:02.916 sys 0m2.199s 00:23:02.916 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:23:02.916 04:23:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:02.916 ************************************ 00:23:02.916 END TEST raid_superblock_test_md_separate 00:23:02.916 ************************************ 00:23:02.916 04:23:50 bdev_raid -- bdev/bdev_raid.sh@841 -- # '[' true = true ']' 00:23:02.916 04:23:50 bdev_raid -- bdev/bdev_raid.sh@842 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:23:02.916 04:23:50 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:23:02.916 04:23:50 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:23:02.916 04:23:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:02.916 ************************************ 
00:23:02.916 START TEST raid_rebuild_test_sb_md_separate 00:23:02.916 ************************************ 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false true 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=2 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local superblock=true 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local background_io=false 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local verify=true 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local strip_size 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local create_arg 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@579 -- # local data_offset 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # '[' true = true ']' 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@593 -- # create_arg+=' -s' 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # raid_pid=3945782 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@598 -- # waitforlisten 3945782 /var/tmp/spdk-raid.sock 00:23:02.916 04:23:50 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@827 -- # '[' -z 3945782 ']' 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@832 -- # local max_retries=100 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:02.916 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # xtrace_disable 00:23:02.916 04:23:50 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:02.916 [2024-05-15 04:23:50.758108] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:23:02.916 [2024-05-15 04:23:50.758185] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3945782 ] 00:23:02.916 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:02.916 Zero copy mechanism will not be used. 00:23:02.916 [2024-05-15 04:23:50.841043] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:03.174 [2024-05-15 04:23:50.963077] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:03.174 [2024-05-15 04:23:51.034623] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:03.175 [2024-05-15 04:23:51.034668] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:03.740 04:23:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:23:03.740 04:23:51 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@860 -- # return 0 00:23:03.740 04:23:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:23:03.740 04:23:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:23:03.998 BaseBdev1_malloc 00:23:03.998 04:23:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:04.255 [2024-05-15 04:23:52.271045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:04.255 [2024-05-15 04:23:52.271114] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:04.255 [2024-05-15 04:23:52.271151] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26e4ac0 00:23:04.255 [2024-05-15 04:23:52.271168] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:04.513 [2024-05-15 04:23:52.272901] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:04.513 [2024-05-15 04:23:52.272930] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:04.513 BaseBdev1 00:23:04.513 04:23:52 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:23:04.513 04:23:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:23:04.771 BaseBdev2_malloc 00:23:04.771 04:23:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:05.028 [2024-05-15 04:23:52.861680] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:05.028 [2024-05-15 04:23:52.861740] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:05.028 [2024-05-15 04:23:52.861766] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2865100 00:23:05.028 [2024-05-15 04:23:52.861779] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:05.028 [2024-05-15 04:23:52.863287] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:05.028 [2024-05-15 04:23:52.863313] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:05.028 BaseBdev2 00:23:05.028 04:23:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:23:05.286 spare_malloc 00:23:05.286 04:23:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:05.545 spare_delay 00:23:05.545 04:23:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@609 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:05.810 [2024-05-15 04:23:53.712003] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:05.810 [2024-05-15 04:23:53.712067] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:05.810 [2024-05-15 04:23:53.712101] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26db380 00:23:05.810 [2024-05-15 04:23:53.712117] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:05.810 [2024-05-15 04:23:53.713764] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:05.810 [2024-05-15 04:23:53.713793] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:05.810 spare 00:23:05.810 04:23:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:06.069 [2024-05-15 04:23:54.000805] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:06.069 [2024-05-15 04:23:54.002238] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:06.069 [2024-05-15 04:23:54.002446] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x26dc220 00:23:06.069 [2024-05-15 04:23:54.002465] 
bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:06.069 [2024-05-15 04:23:54.002578] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2865c20 00:23:06.069 [2024-05-15 04:23:54.002736] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26dc220 00:23:06.069 [2024-05-15 04:23:54.002753] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26dc220 00:23:06.069 [2024-05-15 04:23:54.002864] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.069 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.327 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:06.327 "name": "raid_bdev1", 00:23:06.327 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:06.327 "strip_size_kb": 0, 00:23:06.327 "state": "online", 00:23:06.327 "raid_level": "raid1", 00:23:06.327 "superblock": true, 00:23:06.327 "num_base_bdevs": 2, 00:23:06.327 "num_base_bdevs_discovered": 2, 00:23:06.327 "num_base_bdevs_operational": 2, 00:23:06.327 "base_bdevs_list": [ 00:23:06.327 { 00:23:06.327 "name": "BaseBdev1", 00:23:06.327 "uuid": "f034b531-11c4-5572-b0ab-1a1723775b6e", 00:23:06.327 "is_configured": true, 00:23:06.327 "data_offset": 256, 00:23:06.327 "data_size": 7936 00:23:06.327 }, 00:23:06.327 { 00:23:06.327 "name": "BaseBdev2", 00:23:06.327 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:06.327 "is_configured": true, 00:23:06.327 "data_offset": 256, 00:23:06.327 "data_size": 7936 00:23:06.327 } 00:23:06.327 ] 00:23:06.327 }' 00:23:06.327 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:06.327 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:06.892 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b raid_bdev1 00:23:06.892 04:23:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:23:07.149 [2024-05-15 04:23:55.144042] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:07.149 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # raid_bdev_size=7936 00:23:07.149 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.149 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@619 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@619 -- # data_offset=256 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@621 -- # '[' false = true ']' 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # '[' true = true ']' 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@625 -- # local write_unit_size 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:07.406 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:07.664 [2024-05-15 04:23:55.669266] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2865c20 00:23:07.922 /dev/nbd0 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@865 -- # local i 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # 
break 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:07.922 1+0 records in 00:23:07.922 1+0 records out 00:23:07.922 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189042 s, 21.7 MB/s 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # size=4096 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # return 0 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@629 -- # '[' raid1 = raid5f ']' 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@633 -- # write_unit_size=1 00:23:07.922 04:23:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:23:08.487 7936+0 records in 00:23:08.487 7936+0 records out 00:23:08.487 32505856 bytes (33 MB, 31 MiB) copied, 0.776235 s, 41.9 MB/s 00:23:08.488 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@636 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:08.488 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:08.488 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:08.488 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:08.488 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:23:08.488 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:08.488 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:09.053 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:09.053 [2024-05-15 04:23:56.771615] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:09.053 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:09.053 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:09.053 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:09.053 04:23:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:09.053 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:09.053 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:23:09.053 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:23:09.053 04:23:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:09.053 [2024-05-15 04:23:56.990271] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.053 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.311 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:09.311 "name": "raid_bdev1", 00:23:09.311 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:09.311 "strip_size_kb": 0, 00:23:09.311 "state": "online", 00:23:09.311 "raid_level": "raid1", 00:23:09.311 "superblock": true, 00:23:09.311 "num_base_bdevs": 2, 00:23:09.311 "num_base_bdevs_discovered": 1, 00:23:09.311 "num_base_bdevs_operational": 1, 00:23:09.311 "base_bdevs_list": [ 00:23:09.311 { 00:23:09.311 "name": null, 00:23:09.311 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:09.311 "is_configured": false, 00:23:09.311 "data_offset": 256, 00:23:09.311 "data_size": 7936 00:23:09.311 }, 00:23:09.311 { 00:23:09.311 "name": "BaseBdev2", 00:23:09.311 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:09.311 "is_configured": true, 00:23:09.311 "data_offset": 256, 00:23:09.311 "data_size": 7936 00:23:09.311 } 00:23:09.311 ] 00:23:09.311 }' 00:23:09.311 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:09.311 04:23:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:09.875 04:23:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:10.133 [2024-05-15 04:23:58.041081] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:10.133 [2024-05-15 04:23:58.043977] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26e3a60 00:23:10.133 [2024-05-15 04:23:58.045953] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:10.133 04:23:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@647 -- # sleep 1 00:23:11.064 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:11.064 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:11.064 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:11.064 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:11.064 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:11.064 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.064 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.322 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:11.322 "name": "raid_bdev1", 00:23:11.322 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:11.322 "strip_size_kb": 0, 00:23:11.322 "state": "online", 00:23:11.322 "raid_level": "raid1", 00:23:11.322 "superblock": true, 00:23:11.322 "num_base_bdevs": 2, 00:23:11.322 "num_base_bdevs_discovered": 2, 00:23:11.322 "num_base_bdevs_operational": 2, 00:23:11.322 "process": { 00:23:11.322 "type": "rebuild", 00:23:11.322 "target": "spare", 00:23:11.322 "progress": { 00:23:11.322 "blocks": 3072, 00:23:11.322 "percent": 38 00:23:11.322 } 00:23:11.322 }, 00:23:11.322 "base_bdevs_list": [ 00:23:11.322 { 00:23:11.322 "name": "spare", 00:23:11.322 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:11.322 "is_configured": true, 00:23:11.322 "data_offset": 256, 00:23:11.322 "data_size": 7936 00:23:11.322 }, 00:23:11.322 { 00:23:11.322 "name": "BaseBdev2", 00:23:11.322 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:11.322 "is_configured": true, 00:23:11.322 "data_offset": 256, 00:23:11.322 "data_size": 7936 00:23:11.322 } 00:23:11.322 ] 00:23:11.322 }' 00:23:11.322 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:11.580 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:11.580 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:11.580 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:11.580 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:23:11.842 [2024-05-15 04:23:59.619205] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:11.842 [2024-05-15 04:23:59.659318] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:11.842 [2024-05-15 04:23:59.659376] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.842 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.100 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:12.100 "name": "raid_bdev1", 00:23:12.100 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:12.100 "strip_size_kb": 0, 00:23:12.100 "state": "online", 00:23:12.100 "raid_level": "raid1", 00:23:12.100 "superblock": true, 00:23:12.100 "num_base_bdevs": 2, 00:23:12.100 "num_base_bdevs_discovered": 1, 00:23:12.100 "num_base_bdevs_operational": 1, 00:23:12.100 "base_bdevs_list": [ 00:23:12.100 { 00:23:12.100 "name": null, 00:23:12.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.100 "is_configured": false, 00:23:12.100 "data_offset": 256, 00:23:12.100 "data_size": 7936 00:23:12.100 }, 00:23:12.100 { 00:23:12.100 "name": "BaseBdev2", 00:23:12.100 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:12.100 "is_configured": true, 00:23:12.100 "data_offset": 256, 00:23:12.100 "data_size": 7936 00:23:12.100 } 00:23:12.100 ] 00:23:12.100 }' 00:23:12.100 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:12.100 04:23:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:12.665 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:12.665 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:12.665 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local 
process_type=none 00:23:12.665 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:12.665 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:12.665 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.665 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.923 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:12.923 "name": "raid_bdev1", 00:23:12.923 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:12.923 "strip_size_kb": 0, 00:23:12.923 "state": "online", 00:23:12.923 "raid_level": "raid1", 00:23:12.923 "superblock": true, 00:23:12.923 "num_base_bdevs": 2, 00:23:12.923 "num_base_bdevs_discovered": 1, 00:23:12.923 "num_base_bdevs_operational": 1, 00:23:12.923 "base_bdevs_list": [ 00:23:12.923 { 00:23:12.923 "name": null, 00:23:12.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.923 "is_configured": false, 00:23:12.923 "data_offset": 256, 00:23:12.923 "data_size": 7936 00:23:12.923 }, 00:23:12.923 { 00:23:12.923 "name": "BaseBdev2", 00:23:12.923 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:12.923 "is_configured": true, 00:23:12.923 "data_offset": 256, 00:23:12.923 "data_size": 7936 00:23:12.923 } 00:23:12.923 ] 00:23:12.923 }' 00:23:12.923 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:12.923 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:12.923 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:12.923 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:12.923 04:24:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:13.181 [2024-05-15 04:24:01.063103] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:13.181 [2024-05-15 04:24:01.066100] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ebb90 00:23:13.181 [2024-05-15 04:24:01.067601] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:13.181 04:24:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # sleep 1 00:23:14.113 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:14.113 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:14.113 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:14.113 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:14.113 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:14.113 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.113 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.370 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:14.370 "name": "raid_bdev1", 00:23:14.370 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:14.370 "strip_size_kb": 0, 00:23:14.370 "state": "online", 00:23:14.370 "raid_level": "raid1", 00:23:14.370 "superblock": true, 00:23:14.370 "num_base_bdevs": 2, 00:23:14.370 "num_base_bdevs_discovered": 2, 00:23:14.370 "num_base_bdevs_operational": 2, 00:23:14.370 "process": { 00:23:14.370 "type": "rebuild", 00:23:14.370 "target": "spare", 00:23:14.370 "progress": { 00:23:14.370 "blocks": 3072, 00:23:14.370 "percent": 38 00:23:14.370 } 00:23:14.370 }, 00:23:14.370 "base_bdevs_list": [ 00:23:14.370 { 00:23:14.370 "name": "spare", 00:23:14.370 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:14.370 "is_configured": true, 00:23:14.370 "data_offset": 256, 00:23:14.370 "data_size": 7936 00:23:14.370 }, 00:23:14.370 { 00:23:14.370 "name": "BaseBdev2", 00:23:14.370 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:14.370 "is_configured": true, 00:23:14.370 "data_offset": 256, 00:23:14.370 "data_size": 7936 00:23:14.370 } 00:23:14.370 ] 00:23:14.370 }' 00:23:14.370 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:14.370 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:14.370 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@666 -- # '[' true = true ']' 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@666 -- # '[' = false ']' 00:23:14.628 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 666: [: =: unary operator expected 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=2 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@693 -- # '[' 2 -gt 2 ']' 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # local timeout=918 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:14.628 04:24:02 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.628 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.885 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:14.885 "name": "raid_bdev1", 00:23:14.885 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:14.885 "strip_size_kb": 0, 00:23:14.885 "state": "online", 00:23:14.885 "raid_level": "raid1", 00:23:14.885 "superblock": true, 00:23:14.885 "num_base_bdevs": 2, 00:23:14.885 "num_base_bdevs_discovered": 2, 00:23:14.885 "num_base_bdevs_operational": 2, 00:23:14.885 "process": { 00:23:14.885 "type": "rebuild", 00:23:14.885 "target": "spare", 00:23:14.885 "progress": { 00:23:14.885 "blocks": 3840, 00:23:14.885 "percent": 48 00:23:14.885 } 00:23:14.885 }, 00:23:14.885 "base_bdevs_list": [ 00:23:14.885 { 00:23:14.885 "name": "spare", 00:23:14.885 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:14.885 "is_configured": true, 00:23:14.885 "data_offset": 256, 00:23:14.885 "data_size": 7936 00:23:14.885 }, 00:23:14.885 { 00:23:14.885 "name": "BaseBdev2", 00:23:14.885 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:14.885 "is_configured": true, 00:23:14.885 "data_offset": 256, 00:23:14.885 "data_size": 7936 00:23:14.885 } 00:23:14.885 ] 00:23:14.885 }' 00:23:14.885 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:14.885 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:14.885 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:14.885 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:14.885 04:24:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@711 -- # sleep 1 00:23:15.828 04:24:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:23:15.828 04:24:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:15.828 04:24:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:15.828 04:24:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:15.828 04:24:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:15.828 04:24:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:15.828 04:24:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.828 04:24:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:16.086 04:24:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:16.086 "name": "raid_bdev1", 00:23:16.086 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:16.086 "strip_size_kb": 0, 00:23:16.086 "state": "online", 00:23:16.086 "raid_level": "raid1", 00:23:16.086 "superblock": true, 
00:23:16.086 "num_base_bdevs": 2, 00:23:16.086 "num_base_bdevs_discovered": 2, 00:23:16.086 "num_base_bdevs_operational": 2, 00:23:16.086 "process": { 00:23:16.086 "type": "rebuild", 00:23:16.086 "target": "spare", 00:23:16.086 "progress": { 00:23:16.086 "blocks": 7168, 00:23:16.086 "percent": 90 00:23:16.086 } 00:23:16.086 }, 00:23:16.086 "base_bdevs_list": [ 00:23:16.086 { 00:23:16.086 "name": "spare", 00:23:16.086 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:16.086 "is_configured": true, 00:23:16.086 "data_offset": 256, 00:23:16.086 "data_size": 7936 00:23:16.086 }, 00:23:16.086 { 00:23:16.086 "name": "BaseBdev2", 00:23:16.086 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:16.086 "is_configured": true, 00:23:16.086 "data_offset": 256, 00:23:16.086 "data_size": 7936 00:23:16.086 } 00:23:16.086 ] 00:23:16.086 }' 00:23:16.086 04:24:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:16.086 04:24:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:16.086 04:24:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:16.086 04:24:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:16.086 04:24:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@711 -- # sleep 1 00:23:16.362 [2024-05-15 04:24:04.193341] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:16.362 [2024-05-15 04:24:04.193401] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:16.362 [2024-05-15 04:24:04.193517] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:17.292 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:23:17.292 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:17.292 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:17.292 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:17.292 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:17.292 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:17.292 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.292 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:17.646 "name": "raid_bdev1", 00:23:17.646 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:17.646 "strip_size_kb": 0, 00:23:17.646 "state": "online", 00:23:17.646 "raid_level": "raid1", 00:23:17.646 "superblock": true, 00:23:17.646 "num_base_bdevs": 2, 00:23:17.646 "num_base_bdevs_discovered": 2, 00:23:17.646 "num_base_bdevs_operational": 2, 00:23:17.646 "base_bdevs_list": [ 00:23:17.646 { 00:23:17.646 "name": "spare", 00:23:17.646 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:17.646 
"is_configured": true, 00:23:17.646 "data_offset": 256, 00:23:17.646 "data_size": 7936 00:23:17.646 }, 00:23:17.646 { 00:23:17.646 "name": "BaseBdev2", 00:23:17.646 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:17.646 "is_configured": true, 00:23:17.646 "data_offset": 256, 00:23:17.646 "data_size": 7936 00:23:17.646 } 00:23:17.646 ] 00:23:17.646 }' 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@709 -- # break 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.646 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:17.904 "name": "raid_bdev1", 00:23:17.904 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:17.904 "strip_size_kb": 0, 00:23:17.904 "state": "online", 00:23:17.904 "raid_level": "raid1", 00:23:17.904 "superblock": true, 00:23:17.904 "num_base_bdevs": 2, 00:23:17.904 "num_base_bdevs_discovered": 2, 00:23:17.904 "num_base_bdevs_operational": 2, 00:23:17.904 "base_bdevs_list": [ 00:23:17.904 { 00:23:17.904 "name": "spare", 00:23:17.904 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:17.904 "is_configured": true, 00:23:17.904 "data_offset": 256, 00:23:17.904 "data_size": 7936 00:23:17.904 }, 00:23:17.904 { 00:23:17.904 "name": "BaseBdev2", 00:23:17.904 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:17.904 "is_configured": true, 00:23:17.904 "data_offset": 256, 00:23:17.904 "data_size": 7936 00:23:17.904 } 00:23:17.904 ] 00:23:17.904 }' 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:17.904 
04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.904 04:24:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.162 04:24:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:18.162 "name": "raid_bdev1", 00:23:18.162 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:18.162 "strip_size_kb": 0, 00:23:18.162 "state": "online", 00:23:18.162 "raid_level": "raid1", 00:23:18.162 "superblock": true, 00:23:18.162 "num_base_bdevs": 2, 00:23:18.162 "num_base_bdevs_discovered": 2, 00:23:18.162 "num_base_bdevs_operational": 2, 00:23:18.162 "base_bdevs_list": [ 00:23:18.162 { 00:23:18.162 "name": "spare", 00:23:18.162 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:18.162 "is_configured": true, 00:23:18.162 "data_offset": 256, 00:23:18.162 "data_size": 7936 00:23:18.162 }, 00:23:18.162 { 00:23:18.162 "name": "BaseBdev2", 00:23:18.162 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:18.162 "is_configured": true, 00:23:18.162 "data_offset": 256, 00:23:18.162 "data_size": 7936 00:23:18.162 } 00:23:18.162 ] 00:23:18.162 }' 00:23:18.162 04:24:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:18.162 04:24:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:18.726 04:24:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:18.983 [2024-05-15 04:24:06.844486] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:18.983 [2024-05-15 04:24:06.844518] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:18.983 [2024-05-15 04:24:06.844592] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:18.983 [2024-05-15 04:24:06.844669] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:18.983 [2024-05-15 04:24:06.844687] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26dc220 name raid_bdev1, state offline 00:23:18.983 04:24:06 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.983 04:24:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@720 -- # jq length 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # '[' false = true ']' 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:19.241 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:19.498 /dev/nbd0 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@865 -- # local i 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # break 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:19.498 1+0 records in 00:23:19.498 1+0 records out 00:23:19.498 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000159541 s, 25.7 MB/s 00:23:19.498 04:24:07 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # size=4096 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # return 0 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:19.498 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:19.755 /dev/nbd1 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@865 -- # local i 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # break 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:19.755 1+0 records in 00:23:19.755 1+0 records out 00:23:19.755 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000219151 s, 18.7 MB/s 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # size=4096 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # return 0 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( 
i < 2 )) 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@739 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:19.755 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:20.011 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:20.011 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:20.011 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:20.011 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:20.011 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:20.011 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:20.011 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:23:20.011 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:23:20.011 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:20.011 04:24:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:20.267 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:20.267 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:20.267 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:20.267 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:20.267 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:20.267 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:20.267 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:23:20.267 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:23:20.267 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@743 -- # '[' true = true ']' 00:23:20.267 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 
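The commands just above record how the test checks the rebuilt data: the original base bdev and the rebuilt spare are each exported through NBD, compared byte-for-byte while skipping the first 1048576 bytes (matching the 256-block data_offset at the 4096-byte blocklen reported by bdev_raid_get_bdevs, i.e. the region ahead of the data), and the NBD exports are then stopped. A condensed sketch of that sequence follows, reusing the rpc.py path and /var/tmp/spdk-raid.sock socket from this run; the bdev and /dev/nbd* names mirror the ones used here and are otherwise illustrative.

rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }

# Export the original base bdev and the rebuilt spare as block devices.
rpc nbd_start_disk BaseBdev1 /dev/nbd0
rpc nbd_start_disk spare /dev/nbd1

# Compare only the data region; -i skips the first 1048576 bytes of both
# devices, matching the 256-block data_offset (4096 B blocklen) seen above.
cmp -i 1048576 /dev/nbd0 /dev/nbd1

# Tear the NBD exports back down.
rpc nbd_stop_disk /dev/nbd0
rpc nbd_stop_disk /dev/nbd1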
00:23:20.530 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@746 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:20.836 [2024-05-15 04:24:08.732611] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:20.836 [2024-05-15 04:24:08.732673] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:20.836 [2024-05-15 04:24:08.732700] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26dcb80 00:23:20.836 [2024-05-15 04:24:08.732715] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:20.836 [2024-05-15 04:24:08.734332] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:20.836 [2024-05-15 04:24:08.734361] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:20.836 [2024-05-15 04:24:08.734437] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:20.836 [2024-05-15 04:24:08.734478] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:20.836 [2024-05-15 04:24:08.734596] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:20.836 spare 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.836 04:24:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.120 [2024-05-15 04:24:08.834929] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x26deb00 00:23:21.120 [2024-05-15 04:24:08.834952] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:23:21.120 [2024-05-15 04:24:08.835048] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2865c20 00:23:21.120 [2024-05-15 04:24:08.835229] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26deb00 00:23:21.120 [2024-05-15 04:24:08.835247] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0x26deb00 00:23:21.120 [2024-05-15 04:24:08.835349] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:21.120 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:21.120 "name": "raid_bdev1", 00:23:21.120 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:21.120 "strip_size_kb": 0, 00:23:21.120 "state": "online", 00:23:21.120 "raid_level": "raid1", 00:23:21.120 "superblock": true, 00:23:21.120 "num_base_bdevs": 2, 00:23:21.120 "num_base_bdevs_discovered": 2, 00:23:21.120 "num_base_bdevs_operational": 2, 00:23:21.120 "base_bdevs_list": [ 00:23:21.120 { 00:23:21.120 "name": "spare", 00:23:21.120 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:21.120 "is_configured": true, 00:23:21.120 "data_offset": 256, 00:23:21.120 "data_size": 7936 00:23:21.120 }, 00:23:21.120 { 00:23:21.120 "name": "BaseBdev2", 00:23:21.120 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:21.120 "is_configured": true, 00:23:21.120 "data_offset": 256, 00:23:21.120 "data_size": 7936 00:23:21.120 } 00:23:21.120 ] 00:23:21.120 }' 00:23:21.120 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:21.120 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:21.687 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:21.687 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:21.687 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:21.687 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:21.687 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:21.687 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.687 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:21.945 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:21.945 "name": "raid_bdev1", 00:23:21.945 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:21.945 "strip_size_kb": 0, 00:23:21.945 "state": "online", 00:23:21.945 "raid_level": "raid1", 00:23:21.945 "superblock": true, 00:23:21.945 "num_base_bdevs": 2, 00:23:21.945 "num_base_bdevs_discovered": 2, 00:23:21.945 "num_base_bdevs_operational": 2, 00:23:21.945 "base_bdevs_list": [ 00:23:21.945 { 00:23:21.945 "name": "spare", 00:23:21.945 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:21.945 "is_configured": true, 00:23:21.945 "data_offset": 256, 00:23:21.945 "data_size": 7936 00:23:21.945 }, 00:23:21.945 { 00:23:21.945 "name": "BaseBdev2", 00:23:21.945 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:21.945 "is_configured": true, 00:23:21.945 "data_offset": 256, 00:23:21.945 "data_size": 7936 00:23:21.945 } 00:23:21.945 ] 00:23:21.945 }' 00:23:21.945 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:21.945 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:21.945 04:24:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:21.945 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:21.945 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@750 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.945 04:24:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@750 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:22.203 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@750 -- # [[ spare == \s\p\a\r\e ]] 00:23:22.203 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:22.460 [2024-05-15 04:24:10.320977] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.460 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:22.717 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:22.717 "name": "raid_bdev1", 00:23:22.717 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:22.717 "strip_size_kb": 0, 00:23:22.717 "state": "online", 00:23:22.717 "raid_level": "raid1", 00:23:22.717 "superblock": true, 00:23:22.717 "num_base_bdevs": 2, 00:23:22.717 "num_base_bdevs_discovered": 1, 00:23:22.717 "num_base_bdevs_operational": 1, 00:23:22.717 "base_bdevs_list": [ 00:23:22.717 { 00:23:22.717 "name": null, 00:23:22.717 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:22.717 "is_configured": false, 00:23:22.717 "data_offset": 256, 00:23:22.717 "data_size": 7936 00:23:22.717 }, 00:23:22.717 { 00:23:22.717 "name": "BaseBdev2", 00:23:22.717 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:22.717 "is_configured": true, 00:23:22.717 "data_offset": 256, 00:23:22.717 "data_size": 7936 00:23:22.717 } 00:23:22.717 ] 00:23:22.717 }' 
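As exercised above, verify_raid_bdev_state reduces to a single bdev_raid_get_bdevs call plus jq field checks against the expected state. A hedged stand-alone equivalent of the post-removal check performed here (array stays online as raid1 with one of two base bdevs left); the variable names are illustrative, not the ones bdev_raid.sh uses:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # after bdev_raid_remove_base_bdev spare the array must stay online with one member missing
    [ "$(jq -r '.state' <<< "$info")" = online ]
    [ "$(jq -r '.raid_level' <<< "$info")" = raid1 ]
    [ "$(jq -r '.num_base_bdevs_discovered' <<< "$info")" -eq 1 ]
    [ "$(jq -r '.num_base_bdevs_operational' <<< "$info")" -eq 1 ]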
00:23:22.717 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:22.717 04:24:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:23.326 04:24:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:23.326 [2024-05-15 04:24:11.311655] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:23.326 [2024-05-15 04:24:11.311872] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:23.326 [2024-05-15 04:24:11.311895] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:23.326 [2024-05-15 04:24:11.311929] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:23.326 [2024-05-15 04:24:11.314622] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26dce10 00:23:23.326 [2024-05-15 04:24:11.316632] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:23.326 04:24:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # sleep 1 00:23:24.696 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@757 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:24.697 "name": "raid_bdev1", 00:23:24.697 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:24.697 "strip_size_kb": 0, 00:23:24.697 "state": "online", 00:23:24.697 "raid_level": "raid1", 00:23:24.697 "superblock": true, 00:23:24.697 "num_base_bdevs": 2, 00:23:24.697 "num_base_bdevs_discovered": 2, 00:23:24.697 "num_base_bdevs_operational": 2, 00:23:24.697 "process": { 00:23:24.697 "type": "rebuild", 00:23:24.697 "target": "spare", 00:23:24.697 "progress": { 00:23:24.697 "blocks": 3072, 00:23:24.697 "percent": 38 00:23:24.697 } 00:23:24.697 }, 00:23:24.697 "base_bdevs_list": [ 00:23:24.697 { 00:23:24.697 "name": "spare", 00:23:24.697 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:24.697 "is_configured": true, 00:23:24.697 "data_offset": 256, 00:23:24.697 "data_size": 7936 00:23:24.697 }, 00:23:24.697 { 00:23:24.697 "name": "BaseBdev2", 00:23:24.697 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:24.697 "is_configured": true, 00:23:24.697 "data_offset": 256, 00:23:24.697 "data_size": 7936 00:23:24.697 } 00:23:24.697 ] 00:23:24.697 }' 00:23:24.697 04:24:12 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:24.697 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:24.954 [2024-05-15 04:24:12.874045] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:24.954 [2024-05-15 04:24:12.930155] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:24.954 [2024-05-15 04:24:12.930218] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:24.954 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:24.954 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:24.954 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:24.954 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:24.954 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:24.954 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:24.954 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:24.954 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:24.954 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:24.955 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:24.955 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:24.955 04:24:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:25.213 04:24:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:25.213 "name": "raid_bdev1", 00:23:25.213 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:25.213 "strip_size_kb": 0, 00:23:25.213 "state": "online", 00:23:25.213 "raid_level": "raid1", 00:23:25.213 "superblock": true, 00:23:25.213 "num_base_bdevs": 2, 00:23:25.213 "num_base_bdevs_discovered": 1, 00:23:25.213 "num_base_bdevs_operational": 1, 00:23:25.213 "base_bdevs_list": [ 00:23:25.213 { 00:23:25.213 "name": null, 00:23:25.213 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:25.213 "is_configured": false, 00:23:25.213 "data_offset": 256, 00:23:25.213 "data_size": 7936 00:23:25.213 }, 00:23:25.213 { 00:23:25.213 "name": "BaseBdev2", 00:23:25.213 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:25.213 "is_configured": true, 00:23:25.213 "data_offset": 256, 00:23:25.213 
"data_size": 7936 00:23:25.213 } 00:23:25.213 ] 00:23:25.213 }' 00:23:25.213 04:24:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:25.213 04:24:13 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:25.777 04:24:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:26.035 [2024-05-15 04:24:13.969323] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:26.035 [2024-05-15 04:24:13.969380] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:26.035 [2024-05-15 04:24:13.969408] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2867170 00:23:26.035 [2024-05-15 04:24:13.969424] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:26.035 [2024-05-15 04:24:13.969692] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:26.035 [2024-05-15 04:24:13.969717] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:26.035 [2024-05-15 04:24:13.969785] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:26.035 [2024-05-15 04:24:13.969804] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:26.035 [2024-05-15 04:24:13.969815] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:26.035 [2024-05-15 04:24:13.969847] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:26.035 [2024-05-15 04:24:13.972581] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26dce10 00:23:26.035 [2024-05-15 04:24:13.974055] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:26.035 spare 00:23:26.035 04:24:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # sleep 1 00:23:27.405 04:24:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:27.405 04:24:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:27.405 04:24:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:23:27.405 04:24:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=spare 00:23:27.405 04:24:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:27.405 04:24:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.405 04:24:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.405 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:27.405 "name": "raid_bdev1", 00:23:27.405 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:27.405 "strip_size_kb": 0, 00:23:27.405 "state": "online", 00:23:27.405 "raid_level": "raid1", 00:23:27.405 "superblock": true, 00:23:27.405 "num_base_bdevs": 2, 00:23:27.405 
"num_base_bdevs_discovered": 2, 00:23:27.405 "num_base_bdevs_operational": 2, 00:23:27.405 "process": { 00:23:27.405 "type": "rebuild", 00:23:27.405 "target": "spare", 00:23:27.405 "progress": { 00:23:27.405 "blocks": 3072, 00:23:27.405 "percent": 38 00:23:27.405 } 00:23:27.405 }, 00:23:27.405 "base_bdevs_list": [ 00:23:27.405 { 00:23:27.405 "name": "spare", 00:23:27.405 "uuid": "38bec6bf-95be-5816-adc3-e8c533de5440", 00:23:27.405 "is_configured": true, 00:23:27.405 "data_offset": 256, 00:23:27.405 "data_size": 7936 00:23:27.405 }, 00:23:27.405 { 00:23:27.405 "name": "BaseBdev2", 00:23:27.405 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:27.405 "is_configured": true, 00:23:27.405 "data_offset": 256, 00:23:27.405 "data_size": 7936 00:23:27.405 } 00:23:27.405 ] 00:23:27.405 }' 00:23:27.405 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:27.405 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:27.405 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:27.405 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:23:27.405 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:27.663 [2024-05-15 04:24:15.547305] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:27.663 [2024-05-15 04:24:15.587366] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:27.663 [2024-05-15 04:24:15.587424] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.663 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.921 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # 
raid_bdev_info='{ 00:23:27.921 "name": "raid_bdev1", 00:23:27.921 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:27.921 "strip_size_kb": 0, 00:23:27.921 "state": "online", 00:23:27.921 "raid_level": "raid1", 00:23:27.921 "superblock": true, 00:23:27.921 "num_base_bdevs": 2, 00:23:27.921 "num_base_bdevs_discovered": 1, 00:23:27.921 "num_base_bdevs_operational": 1, 00:23:27.921 "base_bdevs_list": [ 00:23:27.921 { 00:23:27.921 "name": null, 00:23:27.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.921 "is_configured": false, 00:23:27.921 "data_offset": 256, 00:23:27.921 "data_size": 7936 00:23:27.921 }, 00:23:27.921 { 00:23:27.921 "name": "BaseBdev2", 00:23:27.921 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:27.921 "is_configured": true, 00:23:27.921 "data_offset": 256, 00:23:27.921 "data_size": 7936 00:23:27.921 } 00:23:27.921 ] 00:23:27.921 }' 00:23:27.921 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:27.921 04:24:15 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:28.486 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:28.486 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:28.486 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:28.486 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:28.486 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:28.486 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.486 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.744 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:28.744 "name": "raid_bdev1", 00:23:28.744 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:28.744 "strip_size_kb": 0, 00:23:28.744 "state": "online", 00:23:28.744 "raid_level": "raid1", 00:23:28.744 "superblock": true, 00:23:28.744 "num_base_bdevs": 2, 00:23:28.744 "num_base_bdevs_discovered": 1, 00:23:28.744 "num_base_bdevs_operational": 1, 00:23:28.744 "base_bdevs_list": [ 00:23:28.744 { 00:23:28.744 "name": null, 00:23:28.744 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:28.744 "is_configured": false, 00:23:28.744 "data_offset": 256, 00:23:28.744 "data_size": 7936 00:23:28.744 }, 00:23:28.744 { 00:23:28.744 "name": "BaseBdev2", 00:23:28.744 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:28.744 "is_configured": true, 00:23:28.744 "data_offset": 256, 00:23:28.744 "data_size": 7936 00:23:28.744 } 00:23:28.744 ] 00:23:28.744 }' 00:23:28.744 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:28.744 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:28.744 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:28.744 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:28.744 
04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:29.002 04:24:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:29.260 [2024-05-15 04:24:17.208035] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:29.260 [2024-05-15 04:24:17.208105] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:29.260 [2024-05-15 04:24:17.208132] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26e4cf0 00:23:29.260 [2024-05-15 04:24:17.208160] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:29.260 [2024-05-15 04:24:17.208386] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:29.260 [2024-05-15 04:24:17.208407] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:29.260 [2024-05-15 04:24:17.208460] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:29.260 [2024-05-15 04:24:17.208475] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:29.260 [2024-05-15 04:24:17.208484] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:29.260 BaseBdev1 00:23:29.260 04:24:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # sleep 1 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:30.632 "name": "raid_bdev1", 00:23:30.632 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:30.632 
"strip_size_kb": 0, 00:23:30.632 "state": "online", 00:23:30.632 "raid_level": "raid1", 00:23:30.632 "superblock": true, 00:23:30.632 "num_base_bdevs": 2, 00:23:30.632 "num_base_bdevs_discovered": 1, 00:23:30.632 "num_base_bdevs_operational": 1, 00:23:30.632 "base_bdevs_list": [ 00:23:30.632 { 00:23:30.632 "name": null, 00:23:30.632 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.632 "is_configured": false, 00:23:30.632 "data_offset": 256, 00:23:30.632 "data_size": 7936 00:23:30.632 }, 00:23:30.632 { 00:23:30.632 "name": "BaseBdev2", 00:23:30.632 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:30.632 "is_configured": true, 00:23:30.632 "data_offset": 256, 00:23:30.632 "data_size": 7936 00:23:30.632 } 00:23:30.632 ] 00:23:30.632 }' 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:30.632 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:31.196 04:24:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:31.196 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:31.196 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:31.196 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:31.196 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:31.196 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.196 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:31.454 "name": "raid_bdev1", 00:23:31.454 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:31.454 "strip_size_kb": 0, 00:23:31.454 "state": "online", 00:23:31.454 "raid_level": "raid1", 00:23:31.454 "superblock": true, 00:23:31.454 "num_base_bdevs": 2, 00:23:31.454 "num_base_bdevs_discovered": 1, 00:23:31.454 "num_base_bdevs_operational": 1, 00:23:31.454 "base_bdevs_list": [ 00:23:31.454 { 00:23:31.454 "name": null, 00:23:31.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:31.454 "is_configured": false, 00:23:31.454 "data_offset": 256, 00:23:31.454 "data_size": 7936 00:23:31.454 }, 00:23:31.454 { 00:23:31.454 "name": "BaseBdev2", 00:23:31.454 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:31.454 "is_configured": true, 00:23:31.454 "data_offset": 256, 00:23:31.454 "data_size": 7936 00:23:31.454 } 00:23:31.454 ] 00:23:31.454 }' 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:31.454 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:31.712 [2024-05-15 04:24:19.598372] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:31.712 [2024-05-15 04:24:19.598540] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:31.712 [2024-05-15 04:24:19.598562] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:31.712 request: 00:23:31.712 { 00:23:31.712 "raid_bdev": "raid_bdev1", 00:23:31.712 "base_bdev": "BaseBdev1", 00:23:31.712 "method": "bdev_raid_add_base_bdev", 00:23:31.712 "req_id": 1 00:23:31.712 } 00:23:31.712 Got JSON-RPC error response 00:23:31.712 response: 00:23:31.712 { 00:23:31.712 "code": -22, 00:23:31.712 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:31.712 } 00:23:31.712 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:23:31.712 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:31.712 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:31.712 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:31.712 04:24:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # sleep 1 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.645 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.903 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:32.903 "name": "raid_bdev1", 00:23:32.903 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:32.903 "strip_size_kb": 0, 00:23:32.903 "state": "online", 00:23:32.903 "raid_level": "raid1", 00:23:32.903 "superblock": true, 00:23:32.903 "num_base_bdevs": 2, 00:23:32.903 "num_base_bdevs_discovered": 1, 00:23:32.903 "num_base_bdevs_operational": 1, 00:23:32.903 "base_bdevs_list": [ 00:23:32.903 { 00:23:32.903 "name": null, 00:23:32.903 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.903 "is_configured": false, 00:23:32.903 "data_offset": 256, 00:23:32.903 "data_size": 7936 00:23:32.903 }, 00:23:32.903 { 00:23:32.903 "name": "BaseBdev2", 00:23:32.903 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:32.903 "is_configured": true, 00:23:32.903 "data_offset": 256, 00:23:32.903 "data_size": 7936 00:23:32.903 } 00:23:32.903 ] 00:23:32.903 }' 00:23:32.903 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:32.903 04:24:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:33.467 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:33.467 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:23:33.467 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:23:33.467 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local target=none 00:23:33.467 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:23:33.467 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.467 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.725 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:23:33.725 "name": "raid_bdev1", 00:23:33.725 "uuid": "6d0cc0f6-08d6-47be-95a9-6bfd449bd008", 00:23:33.725 "strip_size_kb": 0, 00:23:33.725 "state": "online", 00:23:33.725 "raid_level": "raid1", 00:23:33.725 "superblock": true, 00:23:33.725 "num_base_bdevs": 2, 00:23:33.725 "num_base_bdevs_discovered": 1, 00:23:33.725 "num_base_bdevs_operational": 1, 00:23:33.725 "base_bdevs_list": [ 00:23:33.725 { 00:23:33.725 "name": null, 00:23:33.725 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.725 "is_configured": false, 00:23:33.725 "data_offset": 256, 00:23:33.725 "data_size": 7936 00:23:33.725 }, 00:23:33.725 { 00:23:33.725 "name": "BaseBdev2", 00:23:33.725 "uuid": "1c8e9758-734b-5b75-bc09-4fcbd2b3a58c", 00:23:33.725 "is_configured": true, 00:23:33.725 "data_offset": 256, 00:23:33.725 "data_size": 7936 00:23:33.725 } 00:23:33.725 ] 00:23:33.725 }' 00:23:33.725 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:23:33.725 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:33.725 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@783 -- # killprocess 3945782 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@946 -- # '[' -z 3945782 ']' 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@950 -- # kill -0 3945782 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@951 -- # uname 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3945782 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3945782' 00:23:33.983 killing process with pid 3945782 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@965 -- # kill 3945782 00:23:33.983 Received shutdown signal, test time was about 60.000000 seconds 00:23:33.983 00:23:33.983 Latency(us) 00:23:33.983 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:33.983 =================================================================================================================== 00:23:33.983 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:33.983 [2024-05-15 04:24:21.770537] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:33.983 04:24:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@970 -- # wait 3945782 00:23:33.983 [2024-05-15 04:24:21.770663] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:33.983 
[2024-05-15 04:24:21.770735] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:33.983 [2024-05-15 04:24:21.770752] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26deb00 name raid_bdev1, state offline 00:23:33.983 [2024-05-15 04:24:21.815887] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:34.241 04:24:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@785 -- # return 0 00:23:34.241 00:23:34.241 real 0m31.397s 00:23:34.241 user 0m49.607s 00:23:34.241 sys 0m4.160s 00:23:34.241 04:24:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1122 -- # xtrace_disable 00:23:34.241 04:24:22 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:23:34.241 ************************************ 00:23:34.241 END TEST raid_rebuild_test_sb_md_separate 00:23:34.241 ************************************ 00:23:34.241 04:24:22 bdev_raid -- bdev/bdev_raid.sh@845 -- # base_malloc_params='-m 32 -i' 00:23:34.241 04:24:22 bdev_raid -- bdev/bdev_raid.sh@846 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:23:34.241 04:24:22 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:23:34.241 04:24:22 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:23:34.241 04:24:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:34.241 ************************************ 00:23:34.241 START TEST raid_state_function_test_sb_md_interleaved 00:23:34.241 ************************************ 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1121 -- # raid_state_function_test raid1 2 true 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local raid_level=raid1 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local num_base_bdevs=2 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local superblock=true 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local raid_bdev 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i = 1 )) 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # echo BaseBdev1 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # echo BaseBdev2 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i++ )) 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # (( i <= num_base_bdevs )) 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local base_bdevs 00:23:34.241 
04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local raid_bdev_name=Existed_Raid 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local strip_size_create_arg 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@229 -- # local superblock_create_arg 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@231 -- # '[' raid1 '!=' raid1 ']' 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@235 -- # strip_size=0 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # '[' true = true ']' 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@239 -- # superblock_create_arg=-s 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # raid_pid=3950488 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # echo 'Process raid pid: 3950488' 00:23:34.241 Process raid pid: 3950488 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@247 -- # waitforlisten 3950488 /var/tmp/spdk-raid.sock 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@827 -- # '[' -z 3950488 ']' 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@832 -- # local max_retries=100 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:34.241 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # xtrace_disable 00:23:34.241 04:24:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:34.241 [2024-05-15 04:24:22.198568] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
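The prologue above launches bdev_svc on a dedicated RPC socket and then blocks in waitforlisten until the target answers. A rough stand-alone version of that bring-up, polling the socket with rpc_get_methods instead of the autotest helper (the ~30 s budget and 0.1 s poll interval are assumptions, not values taken from autotest_common.sh):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r "$sock" -i 0 -L bdev_raid &
    raid_pid=$!
    for _ in $(seq 1 300); do                       # ~30 s budget, assumed
        "$rpc" -s "$sock" rpc_get_methods >/dev/null 2>&1 && break
        sleep 0.1
    done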
00:23:34.241 [2024-05-15 04:24:22.198656] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:34.499 [2024-05-15 04:24:22.278529] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:34.499 [2024-05-15 04:24:22.397429] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:34.499 [2024-05-15 04:24:22.465246] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:34.499 [2024-05-15 04:24:22.465288] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:35.434 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:23:35.434 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # return 0 00:23:35.434 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:35.434 [2024-05-15 04:24:23.380381] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:35.434 [2024-05-15 04:24:23.380421] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:35.434 [2024-05-15 04:24:23.380447] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:35.434 [2024-05-15 04:24:23.380458] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:35.434 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:35.434 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:35.434 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:35.434 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:35.434 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:35.434 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:35.434 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:35.435 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:35.435 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:35.435 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:35.435 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.435 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:35.692 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # 
raid_bdev_info='{ 00:23:35.692 "name": "Existed_Raid", 00:23:35.692 "uuid": "3dd63b8d-a0af-401a-8d2c-432ede42b4d4", 00:23:35.692 "strip_size_kb": 0, 00:23:35.692 "state": "configuring", 00:23:35.692 "raid_level": "raid1", 00:23:35.692 "superblock": true, 00:23:35.692 "num_base_bdevs": 2, 00:23:35.692 "num_base_bdevs_discovered": 0, 00:23:35.692 "num_base_bdevs_operational": 2, 00:23:35.692 "base_bdevs_list": [ 00:23:35.692 { 00:23:35.692 "name": "BaseBdev1", 00:23:35.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.692 "is_configured": false, 00:23:35.692 "data_offset": 0, 00:23:35.692 "data_size": 0 00:23:35.692 }, 00:23:35.692 { 00:23:35.692 "name": "BaseBdev2", 00:23:35.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.692 "is_configured": false, 00:23:35.692 "data_offset": 0, 00:23:35.692 "data_size": 0 00:23:35.692 } 00:23:35.692 ] 00:23:35.692 }' 00:23:35.692 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:35.692 04:24:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:36.257 04:24:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@253 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:36.516 [2024-05-15 04:24:24.398959] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:36.516 [2024-05-15 04:24:24.398993] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a0c000 name Existed_Raid, state configuring 00:23:36.516 04:24:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:36.774 [2024-05-15 04:24:24.651639] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:36.774 [2024-05-15 04:24:24.651679] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:36.774 [2024-05-15 04:24:24.651703] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:36.774 [2024-05-15 04:24:24.651714] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:36.774 04:24:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:23:37.032 [2024-05-15 04:24:24.904679] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:37.032 BaseBdev1 00:23:37.032 04:24:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # waitforbdev BaseBdev1 00:23:37.032 04:24:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev1 00:23:37.032 04:24:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:23:37.032 04:24:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local i 00:23:37.032 04:24:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:23:37.032 04:24:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@898 -- # bdev_timeout=2000 00:23:37.032 04:24:24 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:37.289 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:37.547 [ 00:23:37.547 { 00:23:37.547 "name": "BaseBdev1", 00:23:37.547 "aliases": [ 00:23:37.547 "82d53929-efcd-4772-a29c-413b696e4e65" 00:23:37.547 ], 00:23:37.547 "product_name": "Malloc disk", 00:23:37.547 "block_size": 4128, 00:23:37.547 "num_blocks": 8192, 00:23:37.547 "uuid": "82d53929-efcd-4772-a29c-413b696e4e65", 00:23:37.547 "md_size": 32, 00:23:37.547 "md_interleave": true, 00:23:37.547 "dif_type": 0, 00:23:37.547 "assigned_rate_limits": { 00:23:37.547 "rw_ios_per_sec": 0, 00:23:37.547 "rw_mbytes_per_sec": 0, 00:23:37.547 "r_mbytes_per_sec": 0, 00:23:37.547 "w_mbytes_per_sec": 0 00:23:37.547 }, 00:23:37.547 "claimed": true, 00:23:37.547 "claim_type": "exclusive_write", 00:23:37.547 "zoned": false, 00:23:37.547 "supported_io_types": { 00:23:37.547 "read": true, 00:23:37.547 "write": true, 00:23:37.547 "unmap": true, 00:23:37.547 "write_zeroes": true, 00:23:37.547 "flush": true, 00:23:37.547 "reset": true, 00:23:37.547 "compare": false, 00:23:37.547 "compare_and_write": false, 00:23:37.547 "abort": true, 00:23:37.547 "nvme_admin": false, 00:23:37.547 "nvme_io": false 00:23:37.547 }, 00:23:37.547 "memory_domains": [ 00:23:37.547 { 00:23:37.547 "dma_device_id": "system", 00:23:37.547 "dma_device_type": 1 00:23:37.547 }, 00:23:37.547 { 00:23:37.547 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:37.547 "dma_device_type": 2 00:23:37.547 } 00:23:37.547 ], 00:23:37.547 "driver_specific": {} 00:23:37.547 } 00:23:37.547 ] 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # return 0 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.547 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:37.805 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:37.805 "name": "Existed_Raid", 00:23:37.805 "uuid": "011a6e63-c8e0-41fc-af7e-eb8a380ba0b4", 00:23:37.805 "strip_size_kb": 0, 00:23:37.805 "state": "configuring", 00:23:37.805 "raid_level": "raid1", 00:23:37.805 "superblock": true, 00:23:37.805 "num_base_bdevs": 2, 00:23:37.805 "num_base_bdevs_discovered": 1, 00:23:37.805 "num_base_bdevs_operational": 2, 00:23:37.805 "base_bdevs_list": [ 00:23:37.805 { 00:23:37.805 "name": "BaseBdev1", 00:23:37.805 "uuid": "82d53929-efcd-4772-a29c-413b696e4e65", 00:23:37.805 "is_configured": true, 00:23:37.805 "data_offset": 256, 00:23:37.805 "data_size": 7936 00:23:37.805 }, 00:23:37.805 { 00:23:37.805 "name": "BaseBdev2", 00:23:37.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:37.805 "is_configured": false, 00:23:37.805 "data_offset": 0, 00:23:37.805 "data_size": 0 00:23:37.805 } 00:23:37.805 ] 00:23:37.805 }' 00:23:37.805 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:37.805 04:24:25 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:38.371 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@261 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:38.629 [2024-05-15 04:24:26.524988] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:38.629 [2024-05-15 04:24:26.525040] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a0b8f0 name Existed_Raid, state configuring 00:23:38.629 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:23:38.888 [2024-05-15 04:24:26.777695] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:38.888 [2024-05-15 04:24:26.779250] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:38.888 [2024-05-15 04:24:26.779284] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i = 1 )) 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=configuring 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.888 04:24:26 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:39.146 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:39.146 "name": "Existed_Raid", 00:23:39.146 "uuid": "29c2c38a-f3be-4218-9e2e-b5c06cd3ffa5", 00:23:39.146 "strip_size_kb": 0, 00:23:39.146 "state": "configuring", 00:23:39.146 "raid_level": "raid1", 00:23:39.146 "superblock": true, 00:23:39.146 "num_base_bdevs": 2, 00:23:39.146 "num_base_bdevs_discovered": 1, 00:23:39.146 "num_base_bdevs_operational": 2, 00:23:39.146 "base_bdevs_list": [ 00:23:39.146 { 00:23:39.146 "name": "BaseBdev1", 00:23:39.146 "uuid": "82d53929-efcd-4772-a29c-413b696e4e65", 00:23:39.146 "is_configured": true, 00:23:39.146 "data_offset": 256, 00:23:39.146 "data_size": 7936 00:23:39.146 }, 00:23:39.146 { 00:23:39.146 "name": "BaseBdev2", 00:23:39.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:39.146 "is_configured": false, 00:23:39.146 "data_offset": 0, 00:23:39.146 "data_size": 0 00:23:39.146 } 00:23:39.146 ] 00:23:39.146 }' 00:23:39.146 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:39.146 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:39.712 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:23:39.971 [2024-05-15 04:24:27.870375] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:39.971 [2024-05-15 04:24:27.870578] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a0d750 00:23:39.971 [2024-05-15 04:24:27.870597] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:39.971 [2024-05-15 04:24:27.870663] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a0b110 00:23:39.971 [2024-05-15 04:24:27.870757] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a0d750 00:23:39.971 [2024-05-15 04:24:27.870773] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a0d750 00:23:39.971 [2024-05-15 04:24:27.870863] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.971 BaseBdev2 00:23:39.971 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@269 -- # 
waitforbdev BaseBdev2 00:23:39.971 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@895 -- # local bdev_name=BaseBdev2 00:23:39.971 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:23:39.971 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local i 00:23:39.971 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:23:39.971 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:23:39.971 04:24:27 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:40.229 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:40.489 [ 00:23:40.489 { 00:23:40.489 "name": "BaseBdev2", 00:23:40.489 "aliases": [ 00:23:40.489 "1fc9ed77-2989-40a6-910f-f26219045dab" 00:23:40.489 ], 00:23:40.489 "product_name": "Malloc disk", 00:23:40.489 "block_size": 4128, 00:23:40.489 "num_blocks": 8192, 00:23:40.489 "uuid": "1fc9ed77-2989-40a6-910f-f26219045dab", 00:23:40.489 "md_size": 32, 00:23:40.489 "md_interleave": true, 00:23:40.489 "dif_type": 0, 00:23:40.489 "assigned_rate_limits": { 00:23:40.489 "rw_ios_per_sec": 0, 00:23:40.489 "rw_mbytes_per_sec": 0, 00:23:40.489 "r_mbytes_per_sec": 0, 00:23:40.489 "w_mbytes_per_sec": 0 00:23:40.489 }, 00:23:40.489 "claimed": true, 00:23:40.489 "claim_type": "exclusive_write", 00:23:40.489 "zoned": false, 00:23:40.489 "supported_io_types": { 00:23:40.489 "read": true, 00:23:40.489 "write": true, 00:23:40.489 "unmap": true, 00:23:40.489 "write_zeroes": true, 00:23:40.489 "flush": true, 00:23:40.489 "reset": true, 00:23:40.489 "compare": false, 00:23:40.489 "compare_and_write": false, 00:23:40.489 "abort": true, 00:23:40.489 "nvme_admin": false, 00:23:40.489 "nvme_io": false 00:23:40.489 }, 00:23:40.489 "memory_domains": [ 00:23:40.489 { 00:23:40.489 "dma_device_id": "system", 00:23:40.489 "dma_device_type": 1 00:23:40.489 }, 00:23:40.489 { 00:23:40.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:40.489 "dma_device_type": 2 00:23:40.489 } 00:23:40.489 ], 00:23:40.489 "driver_specific": {} 00:23:40.489 } 00:23:40.489 ] 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # return 0 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i++ )) 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # (( i < num_base_bdevs )) 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:40.489 04:24:28 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.489 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:40.748 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:40.748 "name": "Existed_Raid", 00:23:40.748 "uuid": "29c2c38a-f3be-4218-9e2e-b5c06cd3ffa5", 00:23:40.748 "strip_size_kb": 0, 00:23:40.748 "state": "online", 00:23:40.748 "raid_level": "raid1", 00:23:40.748 "superblock": true, 00:23:40.748 "num_base_bdevs": 2, 00:23:40.748 "num_base_bdevs_discovered": 2, 00:23:40.748 "num_base_bdevs_operational": 2, 00:23:40.748 "base_bdevs_list": [ 00:23:40.748 { 00:23:40.748 "name": "BaseBdev1", 00:23:40.748 "uuid": "82d53929-efcd-4772-a29c-413b696e4e65", 00:23:40.748 "is_configured": true, 00:23:40.748 "data_offset": 256, 00:23:40.748 "data_size": 7936 00:23:40.748 }, 00:23:40.748 { 00:23:40.748 "name": "BaseBdev2", 00:23:40.748 "uuid": "1fc9ed77-2989-40a6-910f-f26219045dab", 00:23:40.748 "is_configured": true, 00:23:40.748 "data_offset": 256, 00:23:40.748 "data_size": 7936 00:23:40.748 } 00:23:40.748 ] 00:23:40.748 }' 00:23:40.748 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:40.748 04:24:28 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:41.313 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@272 -- # verify_raid_bdev_properties Existed_Raid 00:23:41.313 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=Existed_Raid 00:23:41.313 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:23:41.313 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:23:41.313 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:23:41.313 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@199 -- # local name 00:23:41.313 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:41.313 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:23:41.570 [2024-05-15 04:24:29.338461] 
bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:41.570 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:23:41.570 "name": "Existed_Raid", 00:23:41.570 "aliases": [ 00:23:41.570 "29c2c38a-f3be-4218-9e2e-b5c06cd3ffa5" 00:23:41.570 ], 00:23:41.570 "product_name": "Raid Volume", 00:23:41.570 "block_size": 4128, 00:23:41.570 "num_blocks": 7936, 00:23:41.570 "uuid": "29c2c38a-f3be-4218-9e2e-b5c06cd3ffa5", 00:23:41.570 "md_size": 32, 00:23:41.570 "md_interleave": true, 00:23:41.570 "dif_type": 0, 00:23:41.570 "assigned_rate_limits": { 00:23:41.570 "rw_ios_per_sec": 0, 00:23:41.570 "rw_mbytes_per_sec": 0, 00:23:41.570 "r_mbytes_per_sec": 0, 00:23:41.570 "w_mbytes_per_sec": 0 00:23:41.570 }, 00:23:41.570 "claimed": false, 00:23:41.570 "zoned": false, 00:23:41.570 "supported_io_types": { 00:23:41.570 "read": true, 00:23:41.570 "write": true, 00:23:41.570 "unmap": false, 00:23:41.570 "write_zeroes": true, 00:23:41.570 "flush": false, 00:23:41.570 "reset": true, 00:23:41.570 "compare": false, 00:23:41.570 "compare_and_write": false, 00:23:41.570 "abort": false, 00:23:41.570 "nvme_admin": false, 00:23:41.570 "nvme_io": false 00:23:41.570 }, 00:23:41.570 "memory_domains": [ 00:23:41.570 { 00:23:41.570 "dma_device_id": "system", 00:23:41.570 "dma_device_type": 1 00:23:41.570 }, 00:23:41.570 { 00:23:41.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.570 "dma_device_type": 2 00:23:41.570 }, 00:23:41.570 { 00:23:41.570 "dma_device_id": "system", 00:23:41.570 "dma_device_type": 1 00:23:41.570 }, 00:23:41.570 { 00:23:41.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.570 "dma_device_type": 2 00:23:41.570 } 00:23:41.570 ], 00:23:41.570 "driver_specific": { 00:23:41.570 "raid": { 00:23:41.570 "uuid": "29c2c38a-f3be-4218-9e2e-b5c06cd3ffa5", 00:23:41.570 "strip_size_kb": 0, 00:23:41.570 "state": "online", 00:23:41.570 "raid_level": "raid1", 00:23:41.570 "superblock": true, 00:23:41.570 "num_base_bdevs": 2, 00:23:41.570 "num_base_bdevs_discovered": 2, 00:23:41.570 "num_base_bdevs_operational": 2, 00:23:41.570 "base_bdevs_list": [ 00:23:41.570 { 00:23:41.570 "name": "BaseBdev1", 00:23:41.570 "uuid": "82d53929-efcd-4772-a29c-413b696e4e65", 00:23:41.570 "is_configured": true, 00:23:41.570 "data_offset": 256, 00:23:41.570 "data_size": 7936 00:23:41.570 }, 00:23:41.570 { 00:23:41.570 "name": "BaseBdev2", 00:23:41.570 "uuid": "1fc9ed77-2989-40a6-910f-f26219045dab", 00:23:41.570 "is_configured": true, 00:23:41.570 "data_offset": 256, 00:23:41.570 "data_size": 7936 00:23:41.570 } 00:23:41.570 ] 00:23:41.570 } 00:23:41.570 } 00:23:41.570 }' 00:23:41.570 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:41.570 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@202 -- # base_bdev_names='BaseBdev1 00:23:41.570 BaseBdev2' 00:23:41.570 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:41.570 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:41.570 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:41.828 04:24:29 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:41.828 "name": "BaseBdev1", 00:23:41.828 "aliases": [ 00:23:41.828 "82d53929-efcd-4772-a29c-413b696e4e65" 00:23:41.828 ], 00:23:41.828 "product_name": "Malloc disk", 00:23:41.828 "block_size": 4128, 00:23:41.828 "num_blocks": 8192, 00:23:41.828 "uuid": "82d53929-efcd-4772-a29c-413b696e4e65", 00:23:41.828 "md_size": 32, 00:23:41.828 "md_interleave": true, 00:23:41.828 "dif_type": 0, 00:23:41.828 "assigned_rate_limits": { 00:23:41.828 "rw_ios_per_sec": 0, 00:23:41.828 "rw_mbytes_per_sec": 0, 00:23:41.828 "r_mbytes_per_sec": 0, 00:23:41.828 "w_mbytes_per_sec": 0 00:23:41.828 }, 00:23:41.828 "claimed": true, 00:23:41.828 "claim_type": "exclusive_write", 00:23:41.828 "zoned": false, 00:23:41.828 "supported_io_types": { 00:23:41.828 "read": true, 00:23:41.828 "write": true, 00:23:41.828 "unmap": true, 00:23:41.828 "write_zeroes": true, 00:23:41.828 "flush": true, 00:23:41.828 "reset": true, 00:23:41.828 "compare": false, 00:23:41.828 "compare_and_write": false, 00:23:41.828 "abort": true, 00:23:41.828 "nvme_admin": false, 00:23:41.828 "nvme_io": false 00:23:41.828 }, 00:23:41.828 "memory_domains": [ 00:23:41.828 { 00:23:41.828 "dma_device_id": "system", 00:23:41.828 "dma_device_type": 1 00:23:41.828 }, 00:23:41.828 { 00:23:41.828 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.828 "dma_device_type": 2 00:23:41.828 } 00:23:41.828 ], 00:23:41.828 "driver_specific": {} 00:23:41.828 }' 00:23:41.828 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:41.828 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:41.828 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:23:41.828 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:41.828 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:41.828 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:23:41.828 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:41.828 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:42.087 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:23:42.087 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:42.087 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:42.087 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:23:42.087 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:42.087 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:42.087 04:24:29 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:42.345 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 
00:23:42.345 "name": "BaseBdev2", 00:23:42.345 "aliases": [ 00:23:42.345 "1fc9ed77-2989-40a6-910f-f26219045dab" 00:23:42.345 ], 00:23:42.345 "product_name": "Malloc disk", 00:23:42.345 "block_size": 4128, 00:23:42.345 "num_blocks": 8192, 00:23:42.345 "uuid": "1fc9ed77-2989-40a6-910f-f26219045dab", 00:23:42.345 "md_size": 32, 00:23:42.345 "md_interleave": true, 00:23:42.345 "dif_type": 0, 00:23:42.345 "assigned_rate_limits": { 00:23:42.345 "rw_ios_per_sec": 0, 00:23:42.345 "rw_mbytes_per_sec": 0, 00:23:42.345 "r_mbytes_per_sec": 0, 00:23:42.345 "w_mbytes_per_sec": 0 00:23:42.345 }, 00:23:42.345 "claimed": true, 00:23:42.345 "claim_type": "exclusive_write", 00:23:42.345 "zoned": false, 00:23:42.345 "supported_io_types": { 00:23:42.345 "read": true, 00:23:42.345 "write": true, 00:23:42.345 "unmap": true, 00:23:42.345 "write_zeroes": true, 00:23:42.345 "flush": true, 00:23:42.345 "reset": true, 00:23:42.345 "compare": false, 00:23:42.345 "compare_and_write": false, 00:23:42.345 "abort": true, 00:23:42.345 "nvme_admin": false, 00:23:42.345 "nvme_io": false 00:23:42.345 }, 00:23:42.345 "memory_domains": [ 00:23:42.345 { 00:23:42.345 "dma_device_id": "system", 00:23:42.345 "dma_device_type": 1 00:23:42.345 }, 00:23:42.345 { 00:23:42.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:42.345 "dma_device_type": 2 00:23:42.346 } 00:23:42.346 ], 00:23:42.346 "driver_specific": {} 00:23:42.346 }' 00:23:42.346 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:42.346 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:42.346 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:23:42.346 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:42.346 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:42.346 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:23:42.346 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:42.604 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:42.604 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:23:42.604 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:42.604 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:42.604 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:23:42.604 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:42.862 [2024-05-15 04:24:30.689837] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # local expected_state 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@277 -- # has_redundancy raid1 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # case $1 in 
00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@215 -- # return 0 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@280 -- # expected_state=online 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@282 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=Existed_Raid 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.862 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:43.120 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:43.120 "name": "Existed_Raid", 00:23:43.120 "uuid": "29c2c38a-f3be-4218-9e2e-b5c06cd3ffa5", 00:23:43.120 "strip_size_kb": 0, 00:23:43.120 "state": "online", 00:23:43.120 "raid_level": "raid1", 00:23:43.120 "superblock": true, 00:23:43.120 "num_base_bdevs": 2, 00:23:43.120 "num_base_bdevs_discovered": 1, 00:23:43.120 "num_base_bdevs_operational": 1, 00:23:43.120 "base_bdevs_list": [ 00:23:43.120 { 00:23:43.120 "name": null, 00:23:43.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:43.120 "is_configured": false, 00:23:43.120 "data_offset": 256, 00:23:43.120 "data_size": 7936 00:23:43.120 }, 00:23:43.120 { 00:23:43.120 "name": "BaseBdev2", 00:23:43.120 "uuid": "1fc9ed77-2989-40a6-910f-f26219045dab", 00:23:43.120 "is_configured": true, 00:23:43.120 "data_offset": 256, 00:23:43.120 "data_size": 7936 00:23:43.120 } 00:23:43.120 ] 00:23:43.120 }' 00:23:43.120 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:43.120 04:24:30 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:43.686 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i = 1 )) 00:23:43.686 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:23:43.686 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.686 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # jq -r '.[0]["name"]' 00:23:43.944 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # raid_bdev=Existed_Raid 00:23:43.944 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@288 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:43.944 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@292 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:43.944 [2024-05-15 04:24:31.930602] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:43.944 [2024-05-15 04:24:31.930693] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:43.944 [2024-05-15 04:24:31.943289] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:43.944 [2024-05-15 04:24:31.943353] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:43.944 [2024-05-15 04:24:31.943367] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a0d750 name Existed_Raid, state offline 00:23:43.944 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i++ )) 00:23:43.944 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # (( i < num_base_bdevs )) 00:23:44.202 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.202 04:24:31 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # jq -r '.[0]["name"] | select(.)' 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # raid_bdev= 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@295 -- # '[' -n '' ']' 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@300 -- # '[' 2 -gt 2 ']' 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@342 -- # killprocess 3950488 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@946 -- # '[' -z 3950488 ']' 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # kill -0 3950488 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # uname 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3950488 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@964 -- # echo 'killing process 
with pid 3950488' 00:23:44.202 killing process with pid 3950488 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@965 -- # kill 3950488 00:23:44.202 [2024-05-15 04:24:32.216300] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:44.202 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@970 -- # wait 3950488 00:23:44.202 [2024-05-15 04:24:32.217464] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:44.460 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@344 -- # return 0 00:23:44.460 00:23:44.460 real 0m10.314s 00:23:44.460 user 0m18.684s 00:23:44.460 sys 0m1.460s 00:23:44.460 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1122 -- # xtrace_disable 00:23:44.460 04:24:32 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:44.460 ************************************ 00:23:44.460 END TEST raid_state_function_test_sb_md_interleaved 00:23:44.460 ************************************ 00:23:44.718 04:24:32 bdev_raid -- bdev/bdev_raid.sh@847 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:23:44.718 04:24:32 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:23:44.718 04:24:32 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:23:44.718 04:24:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:44.718 ************************************ 00:23:44.718 START TEST raid_superblock_test_md_interleaved 00:23:44.718 ************************************ 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1121 -- # raid_superblock_test raid1 2 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local raid_level=raid1 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local num_base_bdevs=2 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_malloc=() 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_malloc 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt=() 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # base_bdevs_pt_uuid=() 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local base_bdevs_pt_uuid 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local raid_bdev_name=raid_bdev1 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local strip_size_create_arg 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev_uuid 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@402 -- # local raid_bdev 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@404 -- # '[' raid1 '!=' raid1 ']' 00:23:44.718 04:24:32 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@408 -- # strip_size=0 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # raid_pid=3951985 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@413 -- # waitforlisten 3951985 /var/tmp/spdk-raid.sock 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@827 -- # '[' -z 3951985 ']' 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@832 -- # local max_retries=100 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:44.718 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # xtrace_disable 00:23:44.718 04:24:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:44.718 [2024-05-15 04:24:32.563579] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:23:44.718 [2024-05-15 04:24:32.563664] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3951985 ] 00:23:44.718 [2024-05-15 04:24:32.640631] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:44.975 [2024-05-15 04:24:32.751685] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:23:44.975 [2024-05-15 04:24:32.823162] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:44.975 [2024-05-15 04:24:32.823210] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@860 -- # return 0 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i = 1 )) 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc1 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt1 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@423 -- # 
base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:45.541 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:23:45.798 malloc1 00:23:45.798 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:46.056 [2024-05-15 04:24:33.983700] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:46.056 [2024-05-15 04:24:33.983770] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:46.056 [2024-05-15 04:24:33.983807] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1314ba0 00:23:46.056 [2024-05-15 04:24:33.983847] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:46.056 [2024-05-15 04:24:33.985394] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:46.056 [2024-05-15 04:24:33.985421] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:46.056 pt1 00:23:46.056 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:23:46.056 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:23:46.056 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_malloc=malloc2 00:23:46.056 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt=pt2 00:23:46.056 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@419 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:46.056 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:46.056 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt+=($bdev_pt) 00:23:46.056 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@423 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:46.056 04:24:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:23:46.313 malloc2 00:23:46.313 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:46.571 [2024-05-15 04:24:34.481231] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:46.571 [2024-05-15 04:24:34.481291] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:46.571 [2024-05-15 04:24:34.481324] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x130b180 00:23:46.571 [2024-05-15 04:24:34.481340] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:46.571 [2024-05-15 04:24:34.483122] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:46.571 [2024-05-15 04:24:34.483149] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 
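Editor's note: the raid_superblock_test that continues below layers passthru bdevs over the malloc bdevs before assembling the array. A condensed standalone equivalent of that sequence is sketched below, using the same socket, names and UUIDs as in this trace, with error handling omitted; the test itself drives the identical RPCs through its bash helpers.

# Sketch only -- the passthru-over-malloc layering used by the superblock test.
RPC="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"

$RPC bdev_malloc_create 32 4096 -m 32 -i -b malloc1
$RPC bdev_malloc_create 32 4096 -m 32 -i -b malloc2

# passthru bdevs give the test fixed names and UUIDs on top of the malloc bdevs
$RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
$RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

# RAID1 over pt1/pt2 with an on-disk superblock (-s), as driven below
$RPC bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'  # expect "online"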
00:23:46.571 pt2 00:23:46.571 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i++ )) 00:23:46.571 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # (( i <= num_base_bdevs )) 00:23:46.571 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:23:46.829 [2024-05-15 04:24:34.725910] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:46.829 [2024-05-15 04:24:34.727424] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:46.829 [2024-05-15 04:24:34.727628] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x130d8f0 00:23:46.829 [2024-05-15 04:24:34.727646] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:46.829 [2024-05-15 04:24:34.727744] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x130e840 00:23:46.829 [2024-05-15 04:24:34.727881] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x130d8f0 00:23:46.829 [2024-05-15 04:24:34.727897] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x130d8f0 00:23:46.829 [2024-05-15 04:24:34.727981] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.829 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.087 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:47.087 "name": "raid_bdev1", 00:23:47.087 "uuid": "2022e6d9-77f7-4335-b36a-34cda03ec4f7", 00:23:47.087 "strip_size_kb": 0, 00:23:47.087 "state": "online", 00:23:47.087 "raid_level": "raid1", 00:23:47.087 "superblock": true, 00:23:47.087 "num_base_bdevs": 2, 00:23:47.087 "num_base_bdevs_discovered": 2, 00:23:47.087 
"num_base_bdevs_operational": 2, 00:23:47.087 "base_bdevs_list": [ 00:23:47.087 { 00:23:47.087 "name": "pt1", 00:23:47.087 "uuid": "844a193e-69d6-5982-9e2e-37234b53c2f9", 00:23:47.087 "is_configured": true, 00:23:47.087 "data_offset": 256, 00:23:47.087 "data_size": 7936 00:23:47.087 }, 00:23:47.087 { 00:23:47.087 "name": "pt2", 00:23:47.087 "uuid": "6e0f4a04-1433-5ee1-9351-527042c5e769", 00:23:47.087 "is_configured": true, 00:23:47.087 "data_offset": 256, 00:23:47.087 "data_size": 7936 00:23:47.087 } 00:23:47.087 ] 00:23:47.087 }' 00:23:47.087 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:47.087 04:24:34 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:47.652 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # verify_raid_bdev_properties raid_bdev1 00:23:47.652 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:23:47.652 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:23:47.652 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:23:47.652 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:23:47.652 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@199 -- # local name 00:23:47.652 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:47.652 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:23:47.910 [2024-05-15 04:24:35.756923] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:47.910 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:23:47.911 "name": "raid_bdev1", 00:23:47.911 "aliases": [ 00:23:47.911 "2022e6d9-77f7-4335-b36a-34cda03ec4f7" 00:23:47.911 ], 00:23:47.911 "product_name": "Raid Volume", 00:23:47.911 "block_size": 4128, 00:23:47.911 "num_blocks": 7936, 00:23:47.911 "uuid": "2022e6d9-77f7-4335-b36a-34cda03ec4f7", 00:23:47.911 "md_size": 32, 00:23:47.911 "md_interleave": true, 00:23:47.911 "dif_type": 0, 00:23:47.911 "assigned_rate_limits": { 00:23:47.911 "rw_ios_per_sec": 0, 00:23:47.911 "rw_mbytes_per_sec": 0, 00:23:47.911 "r_mbytes_per_sec": 0, 00:23:47.911 "w_mbytes_per_sec": 0 00:23:47.911 }, 00:23:47.911 "claimed": false, 00:23:47.911 "zoned": false, 00:23:47.911 "supported_io_types": { 00:23:47.911 "read": true, 00:23:47.911 "write": true, 00:23:47.911 "unmap": false, 00:23:47.911 "write_zeroes": true, 00:23:47.911 "flush": false, 00:23:47.911 "reset": true, 00:23:47.911 "compare": false, 00:23:47.911 "compare_and_write": false, 00:23:47.911 "abort": false, 00:23:47.911 "nvme_admin": false, 00:23:47.911 "nvme_io": false 00:23:47.911 }, 00:23:47.911 "memory_domains": [ 00:23:47.911 { 00:23:47.911 "dma_device_id": "system", 00:23:47.911 "dma_device_type": 1 00:23:47.911 }, 00:23:47.911 { 00:23:47.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:47.911 "dma_device_type": 2 00:23:47.911 }, 00:23:47.911 { 00:23:47.911 "dma_device_id": "system", 00:23:47.911 "dma_device_type": 1 00:23:47.911 }, 00:23:47.911 { 00:23:47.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:47.911 
"dma_device_type": 2 00:23:47.911 } 00:23:47.911 ], 00:23:47.911 "driver_specific": { 00:23:47.911 "raid": { 00:23:47.911 "uuid": "2022e6d9-77f7-4335-b36a-34cda03ec4f7", 00:23:47.911 "strip_size_kb": 0, 00:23:47.911 "state": "online", 00:23:47.911 "raid_level": "raid1", 00:23:47.911 "superblock": true, 00:23:47.911 "num_base_bdevs": 2, 00:23:47.911 "num_base_bdevs_discovered": 2, 00:23:47.911 "num_base_bdevs_operational": 2, 00:23:47.911 "base_bdevs_list": [ 00:23:47.911 { 00:23:47.911 "name": "pt1", 00:23:47.911 "uuid": "844a193e-69d6-5982-9e2e-37234b53c2f9", 00:23:47.911 "is_configured": true, 00:23:47.911 "data_offset": 256, 00:23:47.911 "data_size": 7936 00:23:47.911 }, 00:23:47.911 { 00:23:47.911 "name": "pt2", 00:23:47.911 "uuid": "6e0f4a04-1433-5ee1-9351-527042c5e769", 00:23:47.911 "is_configured": true, 00:23:47.911 "data_offset": 256, 00:23:47.911 "data_size": 7936 00:23:47.911 } 00:23:47.911 ] 00:23:47.911 } 00:23:47.911 } 00:23:47.911 }' 00:23:47.911 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:47.911 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:23:47.911 pt2' 00:23:47.911 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:47.911 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:47.911 04:24:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:48.169 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:48.169 "name": "pt1", 00:23:48.169 "aliases": [ 00:23:48.169 "844a193e-69d6-5982-9e2e-37234b53c2f9" 00:23:48.169 ], 00:23:48.169 "product_name": "passthru", 00:23:48.169 "block_size": 4128, 00:23:48.169 "num_blocks": 8192, 00:23:48.169 "uuid": "844a193e-69d6-5982-9e2e-37234b53c2f9", 00:23:48.169 "md_size": 32, 00:23:48.169 "md_interleave": true, 00:23:48.169 "dif_type": 0, 00:23:48.169 "assigned_rate_limits": { 00:23:48.169 "rw_ios_per_sec": 0, 00:23:48.169 "rw_mbytes_per_sec": 0, 00:23:48.169 "r_mbytes_per_sec": 0, 00:23:48.169 "w_mbytes_per_sec": 0 00:23:48.169 }, 00:23:48.169 "claimed": true, 00:23:48.169 "claim_type": "exclusive_write", 00:23:48.169 "zoned": false, 00:23:48.169 "supported_io_types": { 00:23:48.169 "read": true, 00:23:48.169 "write": true, 00:23:48.169 "unmap": true, 00:23:48.169 "write_zeroes": true, 00:23:48.169 "flush": true, 00:23:48.169 "reset": true, 00:23:48.170 "compare": false, 00:23:48.170 "compare_and_write": false, 00:23:48.170 "abort": true, 00:23:48.170 "nvme_admin": false, 00:23:48.170 "nvme_io": false 00:23:48.170 }, 00:23:48.170 "memory_domains": [ 00:23:48.170 { 00:23:48.170 "dma_device_id": "system", 00:23:48.170 "dma_device_type": 1 00:23:48.170 }, 00:23:48.170 { 00:23:48.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.170 "dma_device_type": 2 00:23:48.170 } 00:23:48.170 ], 00:23:48.170 "driver_specific": { 00:23:48.170 "passthru": { 00:23:48.170 "name": "pt1", 00:23:48.170 "base_bdev_name": "malloc1" 00:23:48.170 } 00:23:48.170 } 00:23:48.170 }' 00:23:48.170 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:48.170 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:48.170 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:23:48.170 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:48.170 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:48.428 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:23:48.428 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:48.428 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:48.428 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:23:48.428 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:48.428 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:48.428 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:23:48.428 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:48.428 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:48.428 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:48.686 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:48.686 "name": "pt2", 00:23:48.686 "aliases": [ 00:23:48.686 "6e0f4a04-1433-5ee1-9351-527042c5e769" 00:23:48.686 ], 00:23:48.686 "product_name": "passthru", 00:23:48.686 "block_size": 4128, 00:23:48.686 "num_blocks": 8192, 00:23:48.686 "uuid": "6e0f4a04-1433-5ee1-9351-527042c5e769", 00:23:48.686 "md_size": 32, 00:23:48.686 "md_interleave": true, 00:23:48.686 "dif_type": 0, 00:23:48.686 "assigned_rate_limits": { 00:23:48.686 "rw_ios_per_sec": 0, 00:23:48.686 "rw_mbytes_per_sec": 0, 00:23:48.686 "r_mbytes_per_sec": 0, 00:23:48.686 "w_mbytes_per_sec": 0 00:23:48.686 }, 00:23:48.686 "claimed": true, 00:23:48.686 "claim_type": "exclusive_write", 00:23:48.686 "zoned": false, 00:23:48.686 "supported_io_types": { 00:23:48.686 "read": true, 00:23:48.686 "write": true, 00:23:48.686 "unmap": true, 00:23:48.686 "write_zeroes": true, 00:23:48.686 "flush": true, 00:23:48.686 "reset": true, 00:23:48.686 "compare": false, 00:23:48.686 "compare_and_write": false, 00:23:48.686 "abort": true, 00:23:48.686 "nvme_admin": false, 00:23:48.686 "nvme_io": false 00:23:48.686 }, 00:23:48.686 "memory_domains": [ 00:23:48.686 { 00:23:48.686 "dma_device_id": "system", 00:23:48.686 "dma_device_type": 1 00:23:48.686 }, 00:23:48.686 { 00:23:48.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.686 "dma_device_type": 2 00:23:48.686 } 00:23:48.686 ], 00:23:48.686 "driver_specific": { 00:23:48.686 "passthru": { 00:23:48.686 "name": "pt2", 00:23:48.686 "base_bdev_name": "malloc2" 00:23:48.686 } 00:23:48.686 } 00:23:48.686 }' 00:23:48.686 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:48.686 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:48.686 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:23:48.686 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:48.686 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:48.944 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:23:48.944 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:48.944 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:48.944 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:23:48.944 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:48.944 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:48.944 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:23:48.944 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:48.944 04:24:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # jq -r '.[] | .uuid' 00:23:49.202 [2024-05-15 04:24:37.096438] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:49.202 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # raid_bdev_uuid=2022e6d9-77f7-4335-b36a-34cda03ec4f7 00:23:49.202 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # '[' -z 2022e6d9-77f7-4335-b36a-34cda03ec4f7 ']' 00:23:49.202 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:49.460 [2024-05-15 04:24:37.336865] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:49.460 [2024-05-15 04:24:37.336889] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:49.460 [2024-05-15 04:24:37.336958] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:49.460 [2024-05-15 04:24:37.337018] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:49.460 [2024-05-15 04:24:37.337036] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x130d8f0 name raid_bdev1, state offline 00:23:49.460 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.460 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # jq -r '.[]' 00:23:49.718 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # raid_bdev= 00:23:49.718 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@443 -- # '[' -n '' ']' 00:23:49.718 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:23:49.718 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:49.976 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # for i in "${base_bdevs_pt[@]}" 00:23:49.976 04:24:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@449 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:50.233 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:50.233 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # '[' false == true ']' 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:50.491 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:23:50.750 [2024-05-15 04:24:38.552055] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:50.750 [2024-05-15 04:24:38.553388] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:50.750 [2024-05-15 04:24:38.553450] bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:50.750 [2024-05-15 04:24:38.553522] 
bdev_raid.c:3046:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:50.750 [2024-05-15 04:24:38.553545] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:50.750 [2024-05-15 04:24:38.553557] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1177540 name raid_bdev1, state configuring 00:23:50.750 request: 00:23:50.750 { 00:23:50.750 "name": "raid_bdev1", 00:23:50.750 "raid_level": "raid1", 00:23:50.750 "base_bdevs": [ 00:23:50.750 "malloc1", 00:23:50.750 "malloc2" 00:23:50.750 ], 00:23:50.750 "superblock": false, 00:23:50.750 "method": "bdev_raid_create", 00:23:50.750 "req_id": 1 00:23:50.750 } 00:23:50.750 Got JSON-RPC error response 00:23:50.750 response: 00:23:50.750 { 00:23:50.750 "code": -17, 00:23:50.750 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:50.750 } 00:23:50.750 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:23:50.750 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:50.750 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:50.750 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:50.750 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.750 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # jq -r '.[]' 00:23:51.008 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # raid_bdev= 00:23:51.008 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@460 -- # '[' -n '' ']' 00:23:51.008 04:24:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@465 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:51.266 [2024-05-15 04:24:39.093447] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:51.266 [2024-05-15 04:24:39.093502] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:51.266 [2024-05-15 04:24:39.093527] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11779a0 00:23:51.266 [2024-05-15 04:24:39.093540] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:51.266 [2024-05-15 04:24:39.094893] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:51.266 [2024-05-15 04:24:39.094916] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:51.266 [2024-05-15 04:24:39.094970] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:51.266 [2024-05-15 04:24:39.095006] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:51.266 pt1 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@468 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # 
local expected_state=configuring 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.266 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:51.523 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:51.523 "name": "raid_bdev1", 00:23:51.523 "uuid": "2022e6d9-77f7-4335-b36a-34cda03ec4f7", 00:23:51.523 "strip_size_kb": 0, 00:23:51.523 "state": "configuring", 00:23:51.523 "raid_level": "raid1", 00:23:51.523 "superblock": true, 00:23:51.523 "num_base_bdevs": 2, 00:23:51.523 "num_base_bdevs_discovered": 1, 00:23:51.523 "num_base_bdevs_operational": 2, 00:23:51.523 "base_bdevs_list": [ 00:23:51.523 { 00:23:51.523 "name": "pt1", 00:23:51.523 "uuid": "844a193e-69d6-5982-9e2e-37234b53c2f9", 00:23:51.523 "is_configured": true, 00:23:51.523 "data_offset": 256, 00:23:51.523 "data_size": 7936 00:23:51.523 }, 00:23:51.523 { 00:23:51.523 "name": null, 00:23:51.523 "uuid": "6e0f4a04-1433-5ee1-9351-527042c5e769", 00:23:51.523 "is_configured": false, 00:23:51.523 "data_offset": 256, 00:23:51.523 "data_size": 7936 00:23:51.523 } 00:23:51.523 ] 00:23:51.523 }' 00:23:51.523 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:51.523 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:52.086 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@470 -- # '[' 2 -gt 2 ']' 00:23:52.086 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i = 1 )) 00:23:52.086 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:23:52.086 04:24:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@479 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:52.343 [2024-05-15 04:24:40.196390] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:52.343 [2024-05-15 04:24:40.196460] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:52.343 [2024-05-15 04:24:40.196490] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x130cac0 00:23:52.343 [2024-05-15 04:24:40.196505] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:52.343 [2024-05-15 
04:24:40.196726] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:52.343 [2024-05-15 04:24:40.196750] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:52.343 [2024-05-15 04:24:40.196808] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:52.343 [2024-05-15 04:24:40.196847] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:52.343 [2024-05-15 04:24:40.196963] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x130f8d0 00:23:52.343 [2024-05-15 04:24:40.196980] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:52.343 [2024-05-15 04:24:40.197048] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11785a0 00:23:52.343 [2024-05-15 04:24:40.197148] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x130f8d0 00:23:52.343 [2024-05-15 04:24:40.197165] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x130f8d0 00:23:52.343 [2024-05-15 04:24:40.197239] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:52.343 pt2 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i++ )) 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # (( i < num_base_bdevs )) 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.343 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.600 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:52.600 "name": "raid_bdev1", 00:23:52.600 "uuid": "2022e6d9-77f7-4335-b36a-34cda03ec4f7", 00:23:52.600 "strip_size_kb": 0, 00:23:52.600 "state": "online", 00:23:52.600 "raid_level": "raid1", 00:23:52.600 "superblock": true, 00:23:52.600 "num_base_bdevs": 2, 00:23:52.600 "num_base_bdevs_discovered": 2, 00:23:52.600 "num_base_bdevs_operational": 2, 
00:23:52.600 "base_bdevs_list": [ 00:23:52.600 { 00:23:52.600 "name": "pt1", 00:23:52.600 "uuid": "844a193e-69d6-5982-9e2e-37234b53c2f9", 00:23:52.600 "is_configured": true, 00:23:52.600 "data_offset": 256, 00:23:52.600 "data_size": 7936 00:23:52.600 }, 00:23:52.600 { 00:23:52.600 "name": "pt2", 00:23:52.600 "uuid": "6e0f4a04-1433-5ee1-9351-527042c5e769", 00:23:52.600 "is_configured": true, 00:23:52.600 "data_offset": 256, 00:23:52.600 "data_size": 7936 00:23:52.600 } 00:23:52.600 ] 00:23:52.600 }' 00:23:52.600 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:52.600 04:24:40 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:53.165 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@484 -- # verify_raid_bdev_properties raid_bdev1 00:23:53.165 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_name=raid_bdev1 00:23:53.165 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local raid_bdev_info 00:23:53.165 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_info 00:23:53.165 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local base_bdev_names 00:23:53.165 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@199 -- # local name 00:23:53.165 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:53.165 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq '.[]' 00:23:53.423 [2024-05-15 04:24:41.263446] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:53.423 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # raid_bdev_info='{ 00:23:53.423 "name": "raid_bdev1", 00:23:53.423 "aliases": [ 00:23:53.423 "2022e6d9-77f7-4335-b36a-34cda03ec4f7" 00:23:53.423 ], 00:23:53.423 "product_name": "Raid Volume", 00:23:53.423 "block_size": 4128, 00:23:53.423 "num_blocks": 7936, 00:23:53.423 "uuid": "2022e6d9-77f7-4335-b36a-34cda03ec4f7", 00:23:53.423 "md_size": 32, 00:23:53.423 "md_interleave": true, 00:23:53.423 "dif_type": 0, 00:23:53.423 "assigned_rate_limits": { 00:23:53.423 "rw_ios_per_sec": 0, 00:23:53.423 "rw_mbytes_per_sec": 0, 00:23:53.423 "r_mbytes_per_sec": 0, 00:23:53.423 "w_mbytes_per_sec": 0 00:23:53.423 }, 00:23:53.423 "claimed": false, 00:23:53.423 "zoned": false, 00:23:53.423 "supported_io_types": { 00:23:53.423 "read": true, 00:23:53.423 "write": true, 00:23:53.423 "unmap": false, 00:23:53.423 "write_zeroes": true, 00:23:53.423 "flush": false, 00:23:53.423 "reset": true, 00:23:53.423 "compare": false, 00:23:53.423 "compare_and_write": false, 00:23:53.423 "abort": false, 00:23:53.423 "nvme_admin": false, 00:23:53.423 "nvme_io": false 00:23:53.423 }, 00:23:53.423 "memory_domains": [ 00:23:53.423 { 00:23:53.423 "dma_device_id": "system", 00:23:53.423 "dma_device_type": 1 00:23:53.423 }, 00:23:53.423 { 00:23:53.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:53.423 "dma_device_type": 2 00:23:53.423 }, 00:23:53.423 { 00:23:53.423 "dma_device_id": "system", 00:23:53.423 "dma_device_type": 1 00:23:53.423 }, 00:23:53.423 { 00:23:53.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:53.423 "dma_device_type": 2 00:23:53.423 } 
00:23:53.423 ], 00:23:53.423 "driver_specific": { 00:23:53.423 "raid": { 00:23:53.423 "uuid": "2022e6d9-77f7-4335-b36a-34cda03ec4f7", 00:23:53.423 "strip_size_kb": 0, 00:23:53.423 "state": "online", 00:23:53.423 "raid_level": "raid1", 00:23:53.423 "superblock": true, 00:23:53.423 "num_base_bdevs": 2, 00:23:53.423 "num_base_bdevs_discovered": 2, 00:23:53.423 "num_base_bdevs_operational": 2, 00:23:53.423 "base_bdevs_list": [ 00:23:53.423 { 00:23:53.423 "name": "pt1", 00:23:53.423 "uuid": "844a193e-69d6-5982-9e2e-37234b53c2f9", 00:23:53.423 "is_configured": true, 00:23:53.423 "data_offset": 256, 00:23:53.423 "data_size": 7936 00:23:53.423 }, 00:23:53.423 { 00:23:53.423 "name": "pt2", 00:23:53.423 "uuid": "6e0f4a04-1433-5ee1-9351-527042c5e769", 00:23:53.423 "is_configured": true, 00:23:53.423 "data_offset": 256, 00:23:53.423 "data_size": 7936 00:23:53.423 } 00:23:53.423 ] 00:23:53.423 } 00:23:53.423 } 00:23:53.423 }' 00:23:53.423 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:53.423 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@202 -- # base_bdev_names='pt1 00:23:53.423 pt2' 00:23:53.423 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:53.423 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:53.423 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:53.681 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:53.681 "name": "pt1", 00:23:53.681 "aliases": [ 00:23:53.681 "844a193e-69d6-5982-9e2e-37234b53c2f9" 00:23:53.681 ], 00:23:53.681 "product_name": "passthru", 00:23:53.681 "block_size": 4128, 00:23:53.681 "num_blocks": 8192, 00:23:53.681 "uuid": "844a193e-69d6-5982-9e2e-37234b53c2f9", 00:23:53.681 "md_size": 32, 00:23:53.681 "md_interleave": true, 00:23:53.681 "dif_type": 0, 00:23:53.681 "assigned_rate_limits": { 00:23:53.681 "rw_ios_per_sec": 0, 00:23:53.681 "rw_mbytes_per_sec": 0, 00:23:53.681 "r_mbytes_per_sec": 0, 00:23:53.681 "w_mbytes_per_sec": 0 00:23:53.681 }, 00:23:53.681 "claimed": true, 00:23:53.681 "claim_type": "exclusive_write", 00:23:53.681 "zoned": false, 00:23:53.681 "supported_io_types": { 00:23:53.681 "read": true, 00:23:53.681 "write": true, 00:23:53.681 "unmap": true, 00:23:53.681 "write_zeroes": true, 00:23:53.681 "flush": true, 00:23:53.681 "reset": true, 00:23:53.681 "compare": false, 00:23:53.681 "compare_and_write": false, 00:23:53.681 "abort": true, 00:23:53.681 "nvme_admin": false, 00:23:53.681 "nvme_io": false 00:23:53.681 }, 00:23:53.681 "memory_domains": [ 00:23:53.681 { 00:23:53.681 "dma_device_id": "system", 00:23:53.681 "dma_device_type": 1 00:23:53.681 }, 00:23:53.681 { 00:23:53.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:53.681 "dma_device_type": 2 00:23:53.681 } 00:23:53.681 ], 00:23:53.681 "driver_specific": { 00:23:53.681 "passthru": { 00:23:53.681 "name": "pt1", 00:23:53.681 "base_bdev_name": "malloc1" 00:23:53.681 } 00:23:53.681 } 00:23:53.681 }' 00:23:53.681 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:53.681 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq 
.block_size 00:23:53.681 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 4128 == 4128 ]] 00:23:53.681 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:53.681 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:53.939 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:23:53.939 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:53.939 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:53.939 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:23:53.939 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:53.939 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:53.939 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:23:53.939 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # for name in $base_bdev_names 00:23:53.939 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:53.939 04:24:41 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq '.[]' 00:23:54.197 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # base_bdev_info='{ 00:23:54.197 "name": "pt2", 00:23:54.197 "aliases": [ 00:23:54.197 "6e0f4a04-1433-5ee1-9351-527042c5e769" 00:23:54.197 ], 00:23:54.197 "product_name": "passthru", 00:23:54.197 "block_size": 4128, 00:23:54.197 "num_blocks": 8192, 00:23:54.197 "uuid": "6e0f4a04-1433-5ee1-9351-527042c5e769", 00:23:54.197 "md_size": 32, 00:23:54.197 "md_interleave": true, 00:23:54.197 "dif_type": 0, 00:23:54.197 "assigned_rate_limits": { 00:23:54.197 "rw_ios_per_sec": 0, 00:23:54.197 "rw_mbytes_per_sec": 0, 00:23:54.197 "r_mbytes_per_sec": 0, 00:23:54.197 "w_mbytes_per_sec": 0 00:23:54.197 }, 00:23:54.197 "claimed": true, 00:23:54.197 "claim_type": "exclusive_write", 00:23:54.197 "zoned": false, 00:23:54.197 "supported_io_types": { 00:23:54.197 "read": true, 00:23:54.197 "write": true, 00:23:54.197 "unmap": true, 00:23:54.197 "write_zeroes": true, 00:23:54.197 "flush": true, 00:23:54.197 "reset": true, 00:23:54.197 "compare": false, 00:23:54.197 "compare_and_write": false, 00:23:54.197 "abort": true, 00:23:54.197 "nvme_admin": false, 00:23:54.197 "nvme_io": false 00:23:54.197 }, 00:23:54.197 "memory_domains": [ 00:23:54.197 { 00:23:54.197 "dma_device_id": "system", 00:23:54.197 "dma_device_type": 1 00:23:54.197 }, 00:23:54.197 { 00:23:54.197 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:54.197 "dma_device_type": 2 00:23:54.197 } 00:23:54.197 ], 00:23:54.197 "driver_specific": { 00:23:54.197 "passthru": { 00:23:54.197 "name": "pt2", 00:23:54.197 "base_bdev_name": "malloc2" 00:23:54.197 } 00:23:54.197 } 00:23:54.197 }' 00:23:54.197 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:54.197 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .block_size 00:23:54.197 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 
4128 == 4128 ]] 00:23:54.197 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:54.455 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_size 00:23:54.455 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ 32 == 32 ]] 00:23:54.455 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:54.455 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .md_interleave 00:23:54.455 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ true == true ]] 00:23:54.455 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:54.455 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # jq .dif_type 00:23:54.455 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@209 -- # [[ 0 == 0 ]] 00:23:54.455 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:54.455 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@487 -- # jq -r '.[] | .uuid' 00:23:54.737 [2024-05-15 04:24:42.675207] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:54.738 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@487 -- # '[' 2022e6d9-77f7-4335-b36a-34cda03ec4f7 '!=' 2022e6d9-77f7-4335-b36a-34cda03ec4f7 ']' 00:23:54.738 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@491 -- # has_redundancy raid1 00:23:54.738 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # case $1 in 00:23:54.738 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@215 -- # return 0 00:23:54.738 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:55.041 [2024-05-15 04:24:42.947756] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@496 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@125 -- # local tmp 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.041 04:24:42 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.321 04:24:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:55.321 "name": "raid_bdev1", 00:23:55.321 "uuid": "2022e6d9-77f7-4335-b36a-34cda03ec4f7", 00:23:55.321 "strip_size_kb": 0, 00:23:55.321 "state": "online", 00:23:55.321 "raid_level": "raid1", 00:23:55.321 "superblock": true, 00:23:55.321 "num_base_bdevs": 2, 00:23:55.321 "num_base_bdevs_discovered": 1, 00:23:55.321 "num_base_bdevs_operational": 1, 00:23:55.321 "base_bdevs_list": [ 00:23:55.321 { 00:23:55.321 "name": null, 00:23:55.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:55.321 "is_configured": false, 00:23:55.321 "data_offset": 256, 00:23:55.321 "data_size": 7936 00:23:55.321 }, 00:23:55.321 { 00:23:55.321 "name": "pt2", 00:23:55.321 "uuid": "6e0f4a04-1433-5ee1-9351-527042c5e769", 00:23:55.321 "is_configured": true, 00:23:55.321 "data_offset": 256, 00:23:55.321 "data_size": 7936 00:23:55.321 } 00:23:55.321 ] 00:23:55.321 }' 00:23:55.321 04:24:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:55.321 04:24:43 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:55.885 04:24:43 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:56.143 [2024-05-15 04:24:43.998520] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:56.143 [2024-05-15 04:24:43.998548] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:56.143 [2024-05-15 04:24:43.998622] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:56.143 [2024-05-15 04:24:43.998681] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:56.143 [2024-05-15 04:24:43.998697] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x130f8d0 name raid_bdev1, state offline 00:23:56.143 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.143 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # jq -r '.[]' 00:23:56.399 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # raid_bdev= 00:23:56.399 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@501 -- # '[' -n '' ']' 00:23:56.399 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i = 1 )) 00:23:56.399 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:23:56.399 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@507 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:56.654 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@506 -- # (( i++ )) 00:23:56.654 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # (( i < num_base_bdevs )) 00:23:56.654 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@511 -- # (( i = 1 )) 00:23:56.654 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@511 -- # (( i < num_base_bdevs - 1 )) 00:23:56.654 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # i=1 00:23:56.655 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@520 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:56.912 [2024-05-15 04:24:44.716375] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:56.912 [2024-05-15 04:24:44.716425] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:56.912 [2024-05-15 04:24:44.716450] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1313ae0 00:23:56.912 [2024-05-15 04:24:44.716465] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:56.912 [2024-05-15 04:24:44.718017] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:56.912 [2024-05-15 04:24:44.718041] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:56.912 [2024-05-15 04:24:44.718093] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:56.912 [2024-05-15 04:24:44.718142] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:56.912 [2024-05-15 04:24:44.718225] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1312bb0 00:23:56.912 [2024-05-15 04:24:44.718239] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:56.912 [2024-05-15 04:24:44.718298] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x130e840 00:23:56.912 [2024-05-15 04:24:44.718384] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1312bb0 00:23:56.912 [2024-05-15 04:24:44.718398] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1312bb0 00:23:56.912 [2024-05-15 04:24:44.718474] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:56.912 pt2 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@523 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:56.912 04:24:44 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.912 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.168 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:57.168 "name": "raid_bdev1", 00:23:57.168 "uuid": "2022e6d9-77f7-4335-b36a-34cda03ec4f7", 00:23:57.168 "strip_size_kb": 0, 00:23:57.168 "state": "online", 00:23:57.168 "raid_level": "raid1", 00:23:57.168 "superblock": true, 00:23:57.168 "num_base_bdevs": 2, 00:23:57.168 "num_base_bdevs_discovered": 1, 00:23:57.168 "num_base_bdevs_operational": 1, 00:23:57.168 "base_bdevs_list": [ 00:23:57.168 { 00:23:57.168 "name": null, 00:23:57.168 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:57.168 "is_configured": false, 00:23:57.168 "data_offset": 256, 00:23:57.168 "data_size": 7936 00:23:57.168 }, 00:23:57.168 { 00:23:57.168 "name": "pt2", 00:23:57.168 "uuid": "6e0f4a04-1433-5ee1-9351-527042c5e769", 00:23:57.168 "is_configured": true, 00:23:57.168 "data_offset": 256, 00:23:57.168 "data_size": 7936 00:23:57.168 } 00:23:57.168 ] 00:23:57.168 }' 00:23:57.168 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:57.168 04:24:44 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:57.732 04:24:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:57.990 [2024-05-15 04:24:45.763132] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:57.990 [2024-05-15 04:24:45.763164] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:57.990 [2024-05-15 04:24:45.763237] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:57.990 [2024-05-15 04:24:45.763298] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:57.990 [2024-05-15 04:24:45.763313] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1312bb0 name raid_bdev1, state offline 00:23:57.990 04:24:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.990 04:24:45 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # jq -r '.[]' 00:23:58.247 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # raid_bdev= 00:23:58.247 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@528 -- # '[' -n '' ']' 00:23:58.247 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@532 -- # '[' 2 -gt 2 ']' 00:23:58.247 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:23:58.247 [2024-05-15 04:24:46.256420] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:58.247 [2024-05-15 04:24:46.256495] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:58.247 [2024-05-15 04:24:46.256522] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1177f30 00:23:58.247 [2024-05-15 04:24:46.256535] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:58.247 [2024-05-15 04:24:46.257899] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:58.247 [2024-05-15 04:24:46.257922] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:58.247 [2024-05-15 04:24:46.257975] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:58.247 [2024-05-15 04:24:46.258011] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:58.247 [2024-05-15 04:24:46.258120] bdev_raid.c:3487:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:23:58.247 [2024-05-15 04:24:46.258136] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:58.247 [2024-05-15 04:24:46.258154] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1310f50 name raid_bdev1, state configuring 00:23:58.247 [2024-05-15 04:24:46.258179] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:58.247 [2024-05-15 04:24:46.258246] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x1312640 00:23:58.247 [2024-05-15 04:24:46.258259] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:23:58.247 [2024-05-15 04:24:46.258323] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x130e840 00:23:58.247 [2024-05-15 04:24:46.258401] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1312640 00:23:58.247 [2024-05-15 04:24:46.258414] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1312640 00:23:58.247 [2024-05-15 04:24:46.258475] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:58.247 pt1 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # '[' 2 -gt 2 ']' 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:23:58.505 "name": "raid_bdev1", 00:23:58.505 "uuid": "2022e6d9-77f7-4335-b36a-34cda03ec4f7", 00:23:58.505 "strip_size_kb": 0, 00:23:58.505 "state": "online", 00:23:58.505 "raid_level": "raid1", 00:23:58.505 "superblock": true, 00:23:58.505 "num_base_bdevs": 2, 00:23:58.505 "num_base_bdevs_discovered": 1, 00:23:58.505 "num_base_bdevs_operational": 1, 00:23:58.505 "base_bdevs_list": [ 00:23:58.505 { 00:23:58.505 "name": null, 00:23:58.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:58.505 "is_configured": false, 00:23:58.505 "data_offset": 256, 00:23:58.505 "data_size": 7936 00:23:58.505 }, 00:23:58.505 { 00:23:58.505 "name": "pt2", 00:23:58.505 "uuid": "6e0f4a04-1433-5ee1-9351-527042c5e769", 00:23:58.505 "is_configured": true, 00:23:58.505 "data_offset": 256, 00:23:58.505 "data_size": 7936 00:23:58.505 } 00:23:58.505 ] 00:23:58.505 }' 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:23:58.505 04:24:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:23:59.071 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:23:59.071 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@555 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:23:59.329 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@555 -- # [[ false == \f\a\l\s\e ]] 00:23:59.329 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@558 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:59.329 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@558 -- # jq -r '.[] | .uuid' 00:23:59.587 [2024-05-15 04:24:47.556087] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@558 -- # '[' 2022e6d9-77f7-4335-b36a-34cda03ec4f7 '!=' 2022e6d9-77f7-4335-b36a-34cda03ec4f7 ']' 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@563 -- # killprocess 3951985 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@946 -- # '[' -z 3951985 ']' 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@950 -- # kill -0 3951985 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@951 -- # uname 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # 
ps --no-headers -o comm= 3951985 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3951985' 00:23:59.587 killing process with pid 3951985 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@965 -- # kill 3951985 00:23:59.587 [2024-05-15 04:24:47.601023] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:59.587 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@970 -- # wait 3951985 00:23:59.587 [2024-05-15 04:24:47.601107] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:59.587 [2024-05-15 04:24:47.601171] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:59.587 [2024-05-15 04:24:47.601187] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1312640 name raid_bdev1, state offline 00:23:59.845 [2024-05-15 04:24:47.624550] bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:00.103 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@565 -- # return 0 00:24:00.103 00:24:00.103 real 0m15.392s 00:24:00.103 user 0m28.349s 00:24:00.103 sys 0m2.157s 00:24:00.103 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:00.103 04:24:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:00.103 ************************************ 00:24:00.103 END TEST raid_superblock_test_md_interleaved 00:24:00.103 ************************************ 00:24:00.103 04:24:47 bdev_raid -- bdev/bdev_raid.sh@848 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:24:00.103 04:24:47 bdev_raid -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:24:00.103 04:24:47 bdev_raid -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:00.103 04:24:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:00.103 ************************************ 00:24:00.103 START TEST raid_rebuild_test_sb_md_interleaved 00:24:00.103 ************************************ 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1121 -- # raid_rebuild_test raid1 2 true false false 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local raid_level=raid1 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local num_base_bdevs=2 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local superblock=true 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local background_io=false 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local verify=false 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # (( i = 1 )) 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:24:00.103 04:24:47 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # echo BaseBdev1 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # echo BaseBdev2 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # (( i++ )) 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # (( i <= num_base_bdevs )) 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local base_bdevs 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local raid_bdev_name=raid_bdev1 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local strip_size 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local create_arg 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local raid_bdev_size 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@579 -- # local data_offset 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@581 -- # '[' raid1 '!=' raid1 ']' 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # strip_size=0 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # '[' true = true ']' 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@593 -- # create_arg+=' -s' 00:24:00.103 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # raid_pid=3954036 00:24:00.104 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:00.104 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@598 -- # waitforlisten 3954036 /var/tmp/spdk-raid.sock 00:24:00.104 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@827 -- # '[' -z 3954036 ']' 00:24:00.104 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:00.104 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:00.104 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:00.104 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:24:00.104 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:00.104 04:24:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:00.104 [2024-05-15 04:24:48.014385] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:00.104 [2024-05-15 04:24:48.014466] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3954036 ] 00:24:00.104 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:00.104 Zero copy mechanism will not be used. 00:24:00.104 [2024-05-15 04:24:48.091629] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:00.362 [2024-05-15 04:24:48.199767] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:00.362 [2024-05-15 04:24:48.271446] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:00.362 [2024-05-15 04:24:48.271490] bdev_raid.c:1433:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:01.295 04:24:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:01.295 04:24:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # return 0 00:24:01.295 04:24:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:24:01.295 04:24:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:24:01.295 BaseBdev1_malloc 00:24:01.295 04:24:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:01.552 [2024-05-15 04:24:49.565487] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:01.552 [2024-05-15 04:24:49.565538] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:01.552 [2024-05-15 04:24:49.565569] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x80f930 00:24:01.552 [2024-05-15 04:24:49.565584] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:01.552 [2024-05-15 04:24:49.566972] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:01.552 [2024-05-15 04:24:49.567005] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:01.811 BaseBdev1 00:24:01.811 04:24:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # for bdev in "${base_bdevs[@]}" 00:24:01.811 04:24:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:24:02.069 BaseBdev2_malloc 00:24:02.069 04:24:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:02.327 [2024-05-15 04:24:50.186700] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:02.327 [2024-05-15 04:24:50.186776] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:02.327 [2024-05-15 04:24:50.186804] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x99d070 00:24:02.327 [2024-05-15 04:24:50.186841] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:02.327 [2024-05-15 04:24:50.188299] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:02.327 [2024-05-15 04:24:50.188321] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:02.327 BaseBdev2 00:24:02.327 04:24:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:24:02.585 spare_malloc 00:24:02.585 04:24:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:02.843 spare_delay 00:24:02.843 04:24:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@609 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:03.101 [2024-05-15 04:24:51.075322] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:03.101 [2024-05-15 04:24:51.075388] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:03.101 [2024-05-15 04:24:51.075423] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x99de50 00:24:03.101 [2024-05-15 04:24:51.075439] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:03.101 [2024-05-15 04:24:51.077081] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:03.101 [2024-05-15 04:24:51.077108] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:03.101 spare 00:24:03.101 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:03.359 [2024-05-15 04:24:51.344058] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:03.359 [2024-05-15 04:24:51.345418] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:03.359 [2024-05-15 04:24:51.345635] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x9a9d20 00:24:03.359 [2024-05-15 04:24:51.345654] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:24:03.359 [2024-05-15 04:24:51.345759] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x807c20 00:24:03.359 [2024-05-15 04:24:51.345905] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9a9d20 00:24:03.359 [2024-05-15 04:24:51.345918] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9a9d20 00:24:03.359 [2024-05-15 04:24:51.345999] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:03.359 04:24:51 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@613 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.359 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.616 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:03.616 "name": "raid_bdev1", 00:24:03.616 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:03.616 "strip_size_kb": 0, 00:24:03.616 "state": "online", 00:24:03.616 "raid_level": "raid1", 00:24:03.616 "superblock": true, 00:24:03.616 "num_base_bdevs": 2, 00:24:03.616 "num_base_bdevs_discovered": 2, 00:24:03.616 "num_base_bdevs_operational": 2, 00:24:03.616 "base_bdevs_list": [ 00:24:03.616 { 00:24:03.616 "name": "BaseBdev1", 00:24:03.616 "uuid": "a0a61b58-083f-598b-838c-e038df8b24d9", 00:24:03.616 "is_configured": true, 00:24:03.616 "data_offset": 256, 00:24:03.616 "data_size": 7936 00:24:03.616 }, 00:24:03.616 { 00:24:03.616 "name": "BaseBdev2", 00:24:03.616 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:03.616 "is_configured": true, 00:24:03.616 "data_offset": 256, 00:24:03.617 "data_size": 7936 00:24:03.617 } 00:24:03.617 ] 00:24:03.617 }' 00:24:03.617 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:03.617 04:24:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:04.182 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:04.182 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # jq -r '.[].num_blocks' 00:24:04.440 [2024-05-15 04:24:52.374983] bdev_raid.c:1124:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:04.440 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # raid_bdev_size=7936 00:24:04.440 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@619 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.440 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@619 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:04.698 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@619 -- # data_offset=256 00:24:04.698 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@621 -- # '[' false = true ']' 00:24:04.698 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- # '[' false = true ']' 00:24:04.698 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:04.955 [2024-05-15 04:24:52.876116] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@643 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:04.955 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.956 04:24:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.213 04:24:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:05.213 "name": "raid_bdev1", 00:24:05.213 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:05.213 "strip_size_kb": 0, 00:24:05.213 "state": "online", 00:24:05.213 "raid_level": "raid1", 00:24:05.213 "superblock": true, 00:24:05.213 "num_base_bdevs": 2, 00:24:05.213 "num_base_bdevs_discovered": 1, 00:24:05.213 "num_base_bdevs_operational": 1, 00:24:05.214 "base_bdevs_list": [ 00:24:05.214 { 00:24:05.214 "name": null, 00:24:05.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.214 "is_configured": false, 00:24:05.214 "data_offset": 256, 00:24:05.214 "data_size": 7936 00:24:05.214 }, 00:24:05.214 { 00:24:05.214 "name": "BaseBdev2", 00:24:05.214 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:05.214 "is_configured": true, 00:24:05.214 "data_offset": 256, 00:24:05.214 "data_size": 7936 00:24:05.214 } 00:24:05.214 ] 00:24:05.214 }' 00:24:05.214 04:24:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 
-- # xtrace_disable 00:24:05.214 04:24:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:05.778 04:24:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:06.036 [2024-05-15 04:24:53.922924] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:06.036 [2024-05-15 04:24:53.927096] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9a08b0 00:24:06.036 [2024-05-15 04:24:53.928769] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:06.036 04:24:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@647 -- # sleep 1 00:24:06.969 04:24:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@650 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:06.969 04:24:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:06.969 04:24:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:06.969 04:24:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:06.969 04:24:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:06.969 04:24:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.969 04:24:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.226 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:07.226 "name": "raid_bdev1", 00:24:07.226 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:07.226 "strip_size_kb": 0, 00:24:07.226 "state": "online", 00:24:07.226 "raid_level": "raid1", 00:24:07.226 "superblock": true, 00:24:07.226 "num_base_bdevs": 2, 00:24:07.226 "num_base_bdevs_discovered": 2, 00:24:07.226 "num_base_bdevs_operational": 2, 00:24:07.226 "process": { 00:24:07.226 "type": "rebuild", 00:24:07.226 "target": "spare", 00:24:07.226 "progress": { 00:24:07.226 "blocks": 3072, 00:24:07.226 "percent": 38 00:24:07.226 } 00:24:07.226 }, 00:24:07.226 "base_bdevs_list": [ 00:24:07.226 { 00:24:07.226 "name": "spare", 00:24:07.226 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:07.226 "is_configured": true, 00:24:07.226 "data_offset": 256, 00:24:07.226 "data_size": 7936 00:24:07.226 }, 00:24:07.226 { 00:24:07.226 "name": "BaseBdev2", 00:24:07.226 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:07.226 "is_configured": true, 00:24:07.226 "data_offset": 256, 00:24:07.226 "data_size": 7936 00:24:07.226 } 00:24:07.226 ] 00:24:07.226 }' 00:24:07.226 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:07.226 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:07.226 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:07.484 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 
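For reference, the bdev stack exercised in the trace above can be reproduced by hand with the same rpc.py calls that appear in it. This is only a sketch: the long Jenkins workspace path is abbreviated to ./scripts/rpc.py, and the socket path, sizes and bdev names are copied verbatim from the trace (the blocklen of 4128 reported above is consistent with 4096-byte blocks carrying 32 bytes of interleaved metadata).

  rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"   # full path in the trace: .../spdk/scripts/rpc.py
  # two passthru-wrapped malloc bdevs become the raid1 members
  $rpc bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc
  $rpc bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
  $rpc bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc
  $rpc bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
  # the "spare" sits behind a delay bdev; the -w/-n 100000 values from the trace appear to slow
  # writes down so the rebuild stays observable
  $rpc bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc
  $rpc bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
  $rpc bdev_passthru_create -b spare_delay -p spare
  # raid1 with an on-disk superblock (-s), then degrade it and start a rebuild onto the spare
  $rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
  $rpc bdev_raid_remove_base_bdev BaseBdev1
  $rpc bdev_raid_add_base_bdev raid_bdev1 spare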
00:24:07.484 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:07.742 [2024-05-15 04:24:55.508301] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:07.742 [2024-05-15 04:24:55.541972] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:07.742 [2024-05-15 04:24:55.542028] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@656 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.742 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.000 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:08.000 "name": "raid_bdev1", 00:24:08.000 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:08.000 "strip_size_kb": 0, 00:24:08.000 "state": "online", 00:24:08.000 "raid_level": "raid1", 00:24:08.000 "superblock": true, 00:24:08.000 "num_base_bdevs": 2, 00:24:08.000 "num_base_bdevs_discovered": 1, 00:24:08.000 "num_base_bdevs_operational": 1, 00:24:08.000 "base_bdevs_list": [ 00:24:08.000 { 00:24:08.000 "name": null, 00:24:08.000 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:08.000 "is_configured": false, 00:24:08.000 "data_offset": 256, 00:24:08.000 "data_size": 7936 00:24:08.000 }, 00:24:08.000 { 00:24:08.000 "name": "BaseBdev2", 00:24:08.000 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:08.000 "is_configured": true, 00:24:08.000 "data_offset": 256, 00:24:08.000 "data_size": 7936 00:24:08.000 } 00:24:08.000 ] 00:24:08.000 }' 00:24:08.000 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:08.000 04:24:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:08.565 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@659 -- # verify_raid_bdev_process raid_bdev1 none none 
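The verify_raid_bdev_state calls in the trace reduce to pulling individual fields back out of bdev_raid_get_bdevs with the jq filter shown above; a condensed sketch of that check (field names taken from the JSON dumps in the trace, rpc path abbreviated as before):

  rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
  echo "$info" | jq -r .state                       # stays "online" even while degraded
  echo "$info" | jq -r .raid_level                  # "raid1"
  echo "$info" | jq -r .num_base_bdevs_discovered   # drops to 1 after a base bdev is removed
  echo "$info" | jq -r .num_base_bdevs_operational  # 2 again once the spare is fully rebuilt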
00:24:08.566 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:08.566 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:08.566 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:08.566 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:08.566 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.566 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.566 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:08.566 "name": "raid_bdev1", 00:24:08.566 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:08.566 "strip_size_kb": 0, 00:24:08.566 "state": "online", 00:24:08.566 "raid_level": "raid1", 00:24:08.566 "superblock": true, 00:24:08.566 "num_base_bdevs": 2, 00:24:08.566 "num_base_bdevs_discovered": 1, 00:24:08.566 "num_base_bdevs_operational": 1, 00:24:08.566 "base_bdevs_list": [ 00:24:08.566 { 00:24:08.566 "name": null, 00:24:08.566 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:08.566 "is_configured": false, 00:24:08.566 "data_offset": 256, 00:24:08.566 "data_size": 7936 00:24:08.566 }, 00:24:08.566 { 00:24:08.566 "name": "BaseBdev2", 00:24:08.566 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:08.566 "is_configured": true, 00:24:08.566 "data_offset": 256, 00:24:08.566 "data_size": 7936 00:24:08.566 } 00:24:08.566 ] 00:24:08.566 }' 00:24:08.566 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:08.823 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:08.823 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:08.823 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:08.823 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:09.081 [2024-05-15 04:24:56.874259] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:09.081 [2024-05-15 04:24:56.879330] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x807c20 00:24:09.081 [2024-05-15 04:24:56.880688] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:09.081 04:24:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # sleep 1 00:24:10.015 04:24:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@664 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.015 04:24:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:10.015 04:24:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:10.015 04:24:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local 
target=spare 00:24:10.015 04:24:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:10.015 04:24:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.015 04:24:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:10.274 "name": "raid_bdev1", 00:24:10.274 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:10.274 "strip_size_kb": 0, 00:24:10.274 "state": "online", 00:24:10.274 "raid_level": "raid1", 00:24:10.274 "superblock": true, 00:24:10.274 "num_base_bdevs": 2, 00:24:10.274 "num_base_bdevs_discovered": 2, 00:24:10.274 "num_base_bdevs_operational": 2, 00:24:10.274 "process": { 00:24:10.274 "type": "rebuild", 00:24:10.274 "target": "spare", 00:24:10.274 "progress": { 00:24:10.274 "blocks": 3072, 00:24:10.274 "percent": 38 00:24:10.274 } 00:24:10.274 }, 00:24:10.274 "base_bdevs_list": [ 00:24:10.274 { 00:24:10.274 "name": "spare", 00:24:10.274 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:10.274 "is_configured": true, 00:24:10.274 "data_offset": 256, 00:24:10.274 "data_size": 7936 00:24:10.274 }, 00:24:10.274 { 00:24:10.274 "name": "BaseBdev2", 00:24:10.274 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:10.274 "is_configured": true, 00:24:10.274 "data_offset": 256, 00:24:10.274 "data_size": 7936 00:24:10.274 } 00:24:10.274 ] 00:24:10.274 }' 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@666 -- # '[' true = true ']' 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@666 -- # '[' = false ']' 00:24:10.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 666: [: =: unary operator expected 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@691 -- # local num_base_bdevs_operational=2 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@693 -- # '[' raid1 = raid1 ']' 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@693 -- # '[' 2 -gt 2 ']' 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # local timeout=974 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 
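The "[: =: unary operator expected" error captured above is the usual single-bracket failure mode: an unquoted expansion that turns out to be empty vanishes before the test runs, leaving '[ = false ]'. A minimal reproduction and the common fixes (generic bash; the actual variable tested at bdev_raid.sh line 666 is not visible in the trace):

  flag=""                  # unset or empty
  [ $flag = false ]        # expands to: [ = false ]   -> bash: [: =: unary operator expected
  [ "$flag" = false ]      # quoted: compares "" with "false" and simply returns false
  [[ $flag = false ]]      # [[ ]] does not word-split, so quoting is not required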
00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.274 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.532 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:10.532 "name": "raid_bdev1", 00:24:10.532 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:10.532 "strip_size_kb": 0, 00:24:10.532 "state": "online", 00:24:10.532 "raid_level": "raid1", 00:24:10.532 "superblock": true, 00:24:10.532 "num_base_bdevs": 2, 00:24:10.532 "num_base_bdevs_discovered": 2, 00:24:10.532 "num_base_bdevs_operational": 2, 00:24:10.532 "process": { 00:24:10.532 "type": "rebuild", 00:24:10.532 "target": "spare", 00:24:10.532 "progress": { 00:24:10.532 "blocks": 3840, 00:24:10.532 "percent": 48 00:24:10.532 } 00:24:10.532 }, 00:24:10.532 "base_bdevs_list": [ 00:24:10.532 { 00:24:10.532 "name": "spare", 00:24:10.532 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:10.532 "is_configured": true, 00:24:10.532 "data_offset": 256, 00:24:10.532 "data_size": 7936 00:24:10.532 }, 00:24:10.532 { 00:24:10.532 "name": "BaseBdev2", 00:24:10.532 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:10.532 "is_configured": true, 00:24:10.532 "data_offset": 256, 00:24:10.532 "data_size": 7936 00:24:10.532 } 00:24:10.532 ] 00:24:10.532 }' 00:24:10.532 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:10.532 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:10.532 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:10.532 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:10.532 04:24:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@711 -- # sleep 1 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.905 04:24:59 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:11.905 "name": "raid_bdev1", 00:24:11.905 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:11.905 "strip_size_kb": 0, 00:24:11.905 "state": "online", 00:24:11.905 "raid_level": "raid1", 00:24:11.905 "superblock": true, 00:24:11.905 "num_base_bdevs": 2, 00:24:11.905 "num_base_bdevs_discovered": 2, 00:24:11.905 "num_base_bdevs_operational": 2, 00:24:11.905 "process": { 00:24:11.905 "type": "rebuild", 00:24:11.905 "target": "spare", 00:24:11.905 "progress": { 00:24:11.905 "blocks": 7168, 00:24:11.905 "percent": 90 00:24:11.905 } 00:24:11.905 }, 00:24:11.905 "base_bdevs_list": [ 00:24:11.905 { 00:24:11.905 "name": "spare", 00:24:11.905 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:11.905 "is_configured": true, 00:24:11.905 "data_offset": 256, 00:24:11.905 "data_size": 7936 00:24:11.905 }, 00:24:11.905 { 00:24:11.905 "name": "BaseBdev2", 00:24:11.905 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:11.905 "is_configured": true, 00:24:11.905 "data_offset": 256, 00:24:11.905 "data_size": 7936 00:24:11.905 } 00:24:11.905 ] 00:24:11.905 }' 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:11.905 04:24:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@711 -- # sleep 1 00:24:12.164 [2024-05-15 04:25:00.006066] bdev_raid.c:2757:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:12.164 [2024-05-15 04:25:00.006128] bdev_raid.c:2474:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:12.164 [2024-05-15 04:25:00.006250] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:13.131 04:25:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # (( SECONDS < timeout )) 00:24:13.131 04:25:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:13.131 04:25:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:13.131 04:25:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:13.131 04:25:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:13.131 04:25:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:13.131 04:25:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.131 04:25:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.131 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:13.131 "name": "raid_bdev1", 00:24:13.131 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:13.131 "strip_size_kb": 0, 00:24:13.131 "state": 
"online", 00:24:13.131 "raid_level": "raid1", 00:24:13.131 "superblock": true, 00:24:13.131 "num_base_bdevs": 2, 00:24:13.131 "num_base_bdevs_discovered": 2, 00:24:13.131 "num_base_bdevs_operational": 2, 00:24:13.131 "base_bdevs_list": [ 00:24:13.131 { 00:24:13.131 "name": "spare", 00:24:13.131 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:13.131 "is_configured": true, 00:24:13.131 "data_offset": 256, 00:24:13.131 "data_size": 7936 00:24:13.131 }, 00:24:13.131 { 00:24:13.131 "name": "BaseBdev2", 00:24:13.131 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:13.131 "is_configured": true, 00:24:13.131 "data_offset": 256, 00:24:13.131 "data_size": 7936 00:24:13.131 } 00:24:13.131 ] 00:24:13.131 }' 00:24:13.131 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \s\p\a\r\e ]] 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@709 -- # break 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.390 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:13.648 "name": "raid_bdev1", 00:24:13.648 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:13.648 "strip_size_kb": 0, 00:24:13.648 "state": "online", 00:24:13.648 "raid_level": "raid1", 00:24:13.648 "superblock": true, 00:24:13.648 "num_base_bdevs": 2, 00:24:13.648 "num_base_bdevs_discovered": 2, 00:24:13.648 "num_base_bdevs_operational": 2, 00:24:13.648 "base_bdevs_list": [ 00:24:13.648 { 00:24:13.648 "name": "spare", 00:24:13.648 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:13.648 "is_configured": true, 00:24:13.648 "data_offset": 256, 00:24:13.648 "data_size": 7936 00:24:13.648 }, 00:24:13.648 { 00:24:13.648 "name": "BaseBdev2", 00:24:13.648 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:13.648 "is_configured": true, 00:24:13.648 "data_offset": 256, 00:24:13.648 "data_size": 7936 00:24:13.648 } 00:24:13.648 ] 00:24:13.648 }' 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 
00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@716 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.648 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:13.907 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:13.907 "name": "raid_bdev1", 00:24:13.907 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:13.907 "strip_size_kb": 0, 00:24:13.907 "state": "online", 00:24:13.907 "raid_level": "raid1", 00:24:13.907 "superblock": true, 00:24:13.907 "num_base_bdevs": 2, 00:24:13.907 "num_base_bdevs_discovered": 2, 00:24:13.907 "num_base_bdevs_operational": 2, 00:24:13.907 "base_bdevs_list": [ 00:24:13.907 { 00:24:13.907 "name": "spare", 00:24:13.907 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:13.907 "is_configured": true, 00:24:13.907 "data_offset": 256, 00:24:13.907 "data_size": 7936 00:24:13.907 }, 00:24:13.907 { 00:24:13.907 "name": "BaseBdev2", 00:24:13.907 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:13.907 "is_configured": true, 00:24:13.907 "data_offset": 256, 00:24:13.907 "data_size": 7936 00:24:13.907 } 00:24:13.907 ] 00:24:13.907 }' 00:24:13.907 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:13.907 04:25:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:14.471 04:25:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:14.729 [2024-05-15 04:25:02.682522] bdev_raid.c:2326:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:14.729 [2024-05-15 04:25:02.682555] bdev_raid.c:1861:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:14.729 
[2024-05-15 04:25:02.682643] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:14.729 [2024-05-15 04:25:02.682721] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:14.729 [2024-05-15 04:25:02.682738] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9a9d20 name raid_bdev1, state offline 00:24:14.729 04:25:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@720 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.729 04:25:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@720 -- # jq length 00:24:14.987 04:25:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@720 -- # [[ 0 == 0 ]] 00:24:14.987 04:25:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:14.987 04:25:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@743 -- # '[' true = true ']' 00:24:14.988 04:25:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:15.245 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@746 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:15.503 [2024-05-15 04:25:03.452526] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:15.503 [2024-05-15 04:25:03.452578] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:15.503 [2024-05-15 04:25:03.452600] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x808ad0 00:24:15.503 [2024-05-15 04:25:03.452613] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:15.503 [2024-05-15 04:25:03.454178] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:15.503 [2024-05-15 04:25:03.454202] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:15.503 [2024-05-15 04:25:03.454265] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:15.503 [2024-05-15 04:25:03.454300] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:15.503 [2024-05-15 04:25:03.454390] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:15.503 spare 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=2 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:15.503 04:25:03 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.503 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:15.760 [2024-05-15 04:25:03.554712] bdev_raid.c:1711:raid_bdev_configure_cont: *DEBUG*: io device register 0x807f50 00:24:15.760 [2024-05-15 04:25:03.554732] bdev_raid.c:1712:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:24:15.760 [2024-05-15 04:25:03.554798] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9a1d60 00:24:15.760 [2024-05-15 04:25:03.554924] bdev_raid.c:1741:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x807f50 00:24:15.760 [2024-05-15 04:25:03.554941] bdev_raid.c:1742:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x807f50 00:24:15.760 [2024-05-15 04:25:03.555020] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:15.760 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:15.760 "name": "raid_bdev1", 00:24:15.760 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:15.760 "strip_size_kb": 0, 00:24:15.760 "state": "online", 00:24:15.760 "raid_level": "raid1", 00:24:15.760 "superblock": true, 00:24:15.760 "num_base_bdevs": 2, 00:24:15.760 "num_base_bdevs_discovered": 2, 00:24:15.760 "num_base_bdevs_operational": 2, 00:24:15.760 "base_bdevs_list": [ 00:24:15.760 { 00:24:15.760 "name": "spare", 00:24:15.760 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:15.760 "is_configured": true, 00:24:15.760 "data_offset": 256, 00:24:15.760 "data_size": 7936 00:24:15.760 }, 00:24:15.760 { 00:24:15.760 "name": "BaseBdev2", 00:24:15.760 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:15.760 "is_configured": true, 00:24:15.760 "data_offset": 256, 00:24:15.760 "data_size": 7936 00:24:15.760 } 00:24:15.760 ] 00:24:15.760 }' 00:24:15.760 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:15.760 04:25:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:16.327 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:16.327 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:16.327 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:16.327 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:16.327 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:16.327 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.327 04:25:04 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.585 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:16.585 "name": "raid_bdev1", 00:24:16.585 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:16.585 "strip_size_kb": 0, 00:24:16.585 "state": "online", 00:24:16.585 "raid_level": "raid1", 00:24:16.585 "superblock": true, 00:24:16.585 "num_base_bdevs": 2, 00:24:16.585 "num_base_bdevs_discovered": 2, 00:24:16.585 "num_base_bdevs_operational": 2, 00:24:16.585 "base_bdevs_list": [ 00:24:16.585 { 00:24:16.585 "name": "spare", 00:24:16.585 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:16.585 "is_configured": true, 00:24:16.585 "data_offset": 256, 00:24:16.585 "data_size": 7936 00:24:16.585 }, 00:24:16.585 { 00:24:16.585 "name": "BaseBdev2", 00:24:16.585 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:16.585 "is_configured": true, 00:24:16.585 "data_offset": 256, 00:24:16.585 "data_size": 7936 00:24:16.585 } 00:24:16.585 ] 00:24:16.585 }' 00:24:16.585 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:16.585 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:16.585 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:16.843 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:16.843 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@750 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.843 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@750 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:17.101 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@750 -- # [[ spare == \s\p\a\r\e ]] 00:24:17.101 04:25:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:17.366 [2024-05-15 04:25:05.133095] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:17.366 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:17.366 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:17.366 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:17.366 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:17.366 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:17.366 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:17.366 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:17.366 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:17.367 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 
00:24:17.367 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:17.367 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.367 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.628 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:17.628 "name": "raid_bdev1", 00:24:17.628 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:17.628 "strip_size_kb": 0, 00:24:17.628 "state": "online", 00:24:17.628 "raid_level": "raid1", 00:24:17.628 "superblock": true, 00:24:17.628 "num_base_bdevs": 2, 00:24:17.628 "num_base_bdevs_discovered": 1, 00:24:17.628 "num_base_bdevs_operational": 1, 00:24:17.628 "base_bdevs_list": [ 00:24:17.628 { 00:24:17.628 "name": null, 00:24:17.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:17.628 "is_configured": false, 00:24:17.628 "data_offset": 256, 00:24:17.628 "data_size": 7936 00:24:17.628 }, 00:24:17.628 { 00:24:17.628 "name": "BaseBdev2", 00:24:17.628 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:17.628 "is_configured": true, 00:24:17.628 "data_offset": 256, 00:24:17.628 "data_size": 7936 00:24:17.628 } 00:24:17.628 ] 00:24:17.628 }' 00:24:17.628 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:17.628 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:18.194 04:25:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:18.194 [2024-05-15 04:25:06.167888] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:18.194 [2024-05-15 04:25:06.168085] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:18.194 [2024-05-15 04:25:06.168107] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
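As the NOTICE lines above indicate, the array was created with a superblock, so the returning bdev still carries one with an older sequence number (4 vs. 5); the raid module recognizes it, re-adds it, and starts another rebuild. One way to watch that the re-added bdev is the rebuild target, reusing the filters from the trace (a sketch, not part of the test script):

  rpc="./scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "raid_bdev1") | .process.target // "none"'   # "spare" while rebuilding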
00:24:18.194 [2024-05-15 04:25:06.168142] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:18.194 [2024-05-15 04:25:06.172636] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x807cb0 00:24:18.194 [2024-05-15 04:25:06.174742] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:18.194 04:25:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # sleep 1 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@757 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:19.569 "name": "raid_bdev1", 00:24:19.569 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:19.569 "strip_size_kb": 0, 00:24:19.569 "state": "online", 00:24:19.569 "raid_level": "raid1", 00:24:19.569 "superblock": true, 00:24:19.569 "num_base_bdevs": 2, 00:24:19.569 "num_base_bdevs_discovered": 2, 00:24:19.569 "num_base_bdevs_operational": 2, 00:24:19.569 "process": { 00:24:19.569 "type": "rebuild", 00:24:19.569 "target": "spare", 00:24:19.569 "progress": { 00:24:19.569 "blocks": 3072, 00:24:19.569 "percent": 38 00:24:19.569 } 00:24:19.569 }, 00:24:19.569 "base_bdevs_list": [ 00:24:19.569 { 00:24:19.569 "name": "spare", 00:24:19.569 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:19.569 "is_configured": true, 00:24:19.569 "data_offset": 256, 00:24:19.569 "data_size": 7936 00:24:19.569 }, 00:24:19.569 { 00:24:19.569 "name": "BaseBdev2", 00:24:19.569 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:19.569 "is_configured": true, 00:24:19.569 "data_offset": 256, 00:24:19.569 "data_size": 7936 00:24:19.569 } 00:24:19.569 ] 00:24:19.569 }' 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:19.569 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:19.827 [2024-05-15 04:25:07.725773] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:19.827 [2024-05-15 04:25:07.787914] 
bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:19.827 [2024-05-15 04:25:07.787986] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:19.827 04:25:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:20.085 04:25:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:20.085 "name": "raid_bdev1", 00:24:20.085 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:20.085 "strip_size_kb": 0, 00:24:20.085 "state": "online", 00:24:20.085 "raid_level": "raid1", 00:24:20.085 "superblock": true, 00:24:20.085 "num_base_bdevs": 2, 00:24:20.085 "num_base_bdevs_discovered": 1, 00:24:20.085 "num_base_bdevs_operational": 1, 00:24:20.085 "base_bdevs_list": [ 00:24:20.085 { 00:24:20.085 "name": null, 00:24:20.085 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:20.085 "is_configured": false, 00:24:20.085 "data_offset": 256, 00:24:20.085 "data_size": 7936 00:24:20.085 }, 00:24:20.085 { 00:24:20.085 "name": "BaseBdev2", 00:24:20.085 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:20.085 "is_configured": true, 00:24:20.085 "data_offset": 256, 00:24:20.085 "data_size": 7936 00:24:20.085 } 00:24:20.085 ] 00:24:20.085 }' 00:24:20.085 04:25:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:20.085 04:25:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:20.651 04:25:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:20.909 [2024-05-15 04:25:08.855656] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:20.909 [2024-05-15 04:25:08.855724] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.909 [2024-05-15 04:25:08.855748] 
vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9987f0 00:24:20.909 [2024-05-15 04:25:08.855761] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.909 [2024-05-15 04:25:08.856012] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.909 [2024-05-15 04:25:08.856034] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:20.909 [2024-05-15 04:25:08.856097] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:20.909 [2024-05-15 04:25:08.856129] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:20.909 [2024-05-15 04:25:08.856139] bdev_raid.c:3560:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:20.909 [2024-05-15 04:25:08.856159] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:20.909 [2024-05-15 04:25:08.860157] bdev_raid.c: 232:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x807020 00:24:20.909 spare 00:24:20.909 [2024-05-15 04:25:08.861569] bdev_raid.c:2792:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:20.909 04:25:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # sleep 1 00:24:22.282 04:25:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:22.282 04:25:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:22.282 04:25:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=rebuild 00:24:22.282 04:25:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=spare 00:24:22.282 04:25:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:22.282 04:25:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.282 04:25:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:22.282 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:22.282 "name": "raid_bdev1", 00:24:22.282 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:22.282 "strip_size_kb": 0, 00:24:22.282 "state": "online", 00:24:22.282 "raid_level": "raid1", 00:24:22.282 "superblock": true, 00:24:22.282 "num_base_bdevs": 2, 00:24:22.282 "num_base_bdevs_discovered": 2, 00:24:22.282 "num_base_bdevs_operational": 2, 00:24:22.282 "process": { 00:24:22.283 "type": "rebuild", 00:24:22.283 "target": "spare", 00:24:22.283 "progress": { 00:24:22.283 "blocks": 3072, 00:24:22.283 "percent": 38 00:24:22.283 } 00:24:22.283 }, 00:24:22.283 "base_bdevs_list": [ 00:24:22.283 { 00:24:22.283 "name": "spare", 00:24:22.283 "uuid": "c288ea04-3f32-51bd-9273-059c90a0b1d2", 00:24:22.283 "is_configured": true, 00:24:22.283 "data_offset": 256, 00:24:22.283 "data_size": 7936 00:24:22.283 }, 00:24:22.283 { 00:24:22.283 "name": "BaseBdev2", 00:24:22.283 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:22.283 "is_configured": true, 00:24:22.283 "data_offset": 256, 00:24:22.283 "data_size": 7936 00:24:22.283 } 
00:24:22.283 ] 00:24:22.283 }' 00:24:22.283 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:22.283 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:22.283 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:22.283 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ spare == \s\p\a\r\e ]] 00:24:22.283 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:22.541 [2024-05-15 04:25:10.481348] bdev_raid.c:2127:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:22.799 [2024-05-15 04:25:10.575571] bdev_raid.c:2483:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:22.799 [2024-05-15 04:25:10.575632] bdev_raid.c: 312:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.799 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.057 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:23.057 "name": "raid_bdev1", 00:24:23.057 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:23.057 "strip_size_kb": 0, 00:24:23.057 "state": "online", 00:24:23.057 "raid_level": "raid1", 00:24:23.057 "superblock": true, 00:24:23.057 "num_base_bdevs": 2, 00:24:23.057 "num_base_bdevs_discovered": 1, 00:24:23.057 "num_base_bdevs_operational": 1, 00:24:23.057 "base_bdevs_list": [ 00:24:23.057 { 00:24:23.057 "name": null, 00:24:23.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.057 "is_configured": false, 00:24:23.057 "data_offset": 256, 00:24:23.057 "data_size": 7936 00:24:23.057 }, 00:24:23.057 { 00:24:23.057 "name": "BaseBdev2", 00:24:23.057 "uuid": 
"34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:23.057 "is_configured": true, 00:24:23.057 "data_offset": 256, 00:24:23.057 "data_size": 7936 00:24:23.057 } 00:24:23.057 ] 00:24:23.057 }' 00:24:23.057 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:23.057 04:25:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:23.621 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:23.621 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:23.621 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:23.621 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:23.621 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:23.621 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.621 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.880 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:23.880 "name": "raid_bdev1", 00:24:23.880 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:23.880 "strip_size_kb": 0, 00:24:23.880 "state": "online", 00:24:23.880 "raid_level": "raid1", 00:24:23.880 "superblock": true, 00:24:23.880 "num_base_bdevs": 2, 00:24:23.880 "num_base_bdevs_discovered": 1, 00:24:23.880 "num_base_bdevs_operational": 1, 00:24:23.880 "base_bdevs_list": [ 00:24:23.880 { 00:24:23.880 "name": null, 00:24:23.880 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.880 "is_configured": false, 00:24:23.880 "data_offset": 256, 00:24:23.880 "data_size": 7936 00:24:23.880 }, 00:24:23.880 { 00:24:23.880 "name": "BaseBdev2", 00:24:23.880 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:23.880 "is_configured": true, 00:24:23.880 "data_offset": 256, 00:24:23.880 "data_size": 7936 00:24:23.880 } 00:24:23.880 ] 00:24:23.880 }' 00:24:23.880 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:23.880 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:23.880 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:23.880 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:23.880 04:25:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:24.138 04:25:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:24.396 [2024-05-15 04:25:12.284987] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:24.396 [2024-05-15 04:25:12.285047] vbdev_passthru.c: 
636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:24.396 [2024-05-15 04:25:12.285076] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x997ea0 00:24:24.396 [2024-05-15 04:25:12.285089] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:24.396 [2024-05-15 04:25:12.285345] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:24.396 [2024-05-15 04:25:12.285367] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:24.396 [2024-05-15 04:25:12.285422] bdev_raid.c:3692:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:24.396 [2024-05-15 04:25:12.285439] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:24.396 [2024-05-15 04:25:12.285456] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:24.396 BaseBdev1 00:24:24.396 04:25:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # sleep 1 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.330 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.600 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:25.600 "name": "raid_bdev1", 00:24:25.600 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:25.600 "strip_size_kb": 0, 00:24:25.600 "state": "online", 00:24:25.600 "raid_level": "raid1", 00:24:25.600 "superblock": true, 00:24:25.600 "num_base_bdevs": 2, 00:24:25.600 "num_base_bdevs_discovered": 1, 00:24:25.600 "num_base_bdevs_operational": 1, 00:24:25.600 "base_bdevs_list": [ 00:24:25.600 { 00:24:25.600 "name": null, 00:24:25.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.600 "is_configured": false, 00:24:25.600 "data_offset": 256, 00:24:25.600 "data_size": 7936 00:24:25.600 }, 00:24:25.600 { 00:24:25.600 "name": "BaseBdev2", 00:24:25.600 "uuid": 
"34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:25.600 "is_configured": true, 00:24:25.600 "data_offset": 256, 00:24:25.600 "data_size": 7936 00:24:25.600 } 00:24:25.600 ] 00:24:25.600 }' 00:24:25.600 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:25.601 04:25:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:26.171 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:26.171 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:26.171 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:26.171 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:26.171 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:26.171 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.171 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.429 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:26.429 "name": "raid_bdev1", 00:24:26.429 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:26.429 "strip_size_kb": 0, 00:24:26.429 "state": "online", 00:24:26.429 "raid_level": "raid1", 00:24:26.429 "superblock": true, 00:24:26.429 "num_base_bdevs": 2, 00:24:26.429 "num_base_bdevs_discovered": 1, 00:24:26.429 "num_base_bdevs_operational": 1, 00:24:26.429 "base_bdevs_list": [ 00:24:26.429 { 00:24:26.429 "name": null, 00:24:26.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.429 "is_configured": false, 00:24:26.429 "data_offset": 256, 00:24:26.429 "data_size": 7936 00:24:26.429 }, 00:24:26.429 { 00:24:26.429 "name": "BaseBdev2", 00:24:26.429 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:26.429 "is_configured": true, 00:24:26.429 "data_offset": 256, 00:24:26.429 "data_size": 7936 00:24:26.429 } 00:24:26.429 ] 00:24:26.429 }' 00:24:26.429 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:26.429 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:26.429 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:26.687 04:25:14 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:26.687 [2024-05-15 04:25:14.679384] bdev_raid.c:3138:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:26.687 [2024-05-15 04:25:14.679557] bdev_raid.c:3502:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:26.687 [2024-05-15 04:25:14.679577] bdev_raid.c:3521:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:26.687 request: 00:24:26.687 { 00:24:26.687 "raid_bdev": "raid_bdev1", 00:24:26.687 "base_bdev": "BaseBdev1", 00:24:26.687 "method": "bdev_raid_add_base_bdev", 00:24:26.687 "req_id": 1 00:24:26.687 } 00:24:26.687 Got JSON-RPC error response 00:24:26.687 response: 00:24:26.687 { 00:24:26.687 "code": -22, 00:24:26.687 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:26.687 } 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:26.687 04:25:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # sleep 1 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local raid_bdev_name=raid_bdev1 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local expected_state=online 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local raid_level=raid1 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@120 -- # local strip_size=0 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local num_base_bdevs_operational=1 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local raid_bdev_info 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local num_base_bdevs_discovered 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@125 -- # local tmp 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@127 -- # raid_bdev_info='{ 00:24:28.061 "name": "raid_bdev1", 00:24:28.061 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:28.061 "strip_size_kb": 0, 00:24:28.061 "state": "online", 00:24:28.061 "raid_level": "raid1", 00:24:28.061 "superblock": true, 00:24:28.061 "num_base_bdevs": 2, 00:24:28.061 "num_base_bdevs_discovered": 1, 00:24:28.061 "num_base_bdevs_operational": 1, 00:24:28.061 "base_bdevs_list": [ 00:24:28.061 { 00:24:28.061 "name": null, 00:24:28.061 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:28.061 "is_configured": false, 00:24:28.061 "data_offset": 256, 00:24:28.061 "data_size": 7936 00:24:28.061 }, 00:24:28.061 { 00:24:28.061 "name": "BaseBdev2", 00:24:28.061 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:28.061 "is_configured": true, 00:24:28.061 "data_offset": 256, 00:24:28.061 "data_size": 7936 00:24:28.061 } 00:24:28.061 ] 00:24:28.061 }' 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@129 -- # xtrace_disable 00:24:28.061 04:25:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:28.627 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@780 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:28.627 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local raid_bdev_name=raid_bdev1 00:24:28.627 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local process_type=none 00:24:28.627 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local target=none 00:24:28.627 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@186 -- # local raid_bdev_info 00:24:28.627 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.627 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.885 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@188 -- # raid_bdev_info='{ 00:24:28.885 "name": "raid_bdev1", 00:24:28.885 "uuid": "55a340a7-4815-472b-95ff-336ec0938e2b", 00:24:28.885 "strip_size_kb": 0, 00:24:28.885 "state": "online", 00:24:28.885 "raid_level": "raid1", 00:24:28.885 "superblock": true, 
00:24:28.885 "num_base_bdevs": 2, 00:24:28.885 "num_base_bdevs_discovered": 1, 00:24:28.885 "num_base_bdevs_operational": 1, 00:24:28.885 "base_bdevs_list": [ 00:24:28.885 { 00:24:28.885 "name": null, 00:24:28.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:28.885 "is_configured": false, 00:24:28.885 "data_offset": 256, 00:24:28.885 "data_size": 7936 00:24:28.885 }, 00:24:28.885 { 00:24:28.885 "name": "BaseBdev2", 00:24:28.885 "uuid": "34e377cc-60e6-5b26-a64a-a650a4c51700", 00:24:28.885 "is_configured": true, 00:24:28.885 "data_offset": 256, 00:24:28.885 "data_size": 7936 00:24:28.885 } 00:24:28.885 ] 00:24:28.885 }' 00:24:28.885 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.type // "none"' 00:24:28.885 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:28.885 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # jq -r '.process.target // "none"' 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@191 -- # [[ none == \n\o\n\e ]] 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@783 -- # killprocess 3954036 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@946 -- # '[' -z 3954036 ']' 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # kill -0 3954036 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # uname 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3954036 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3954036' 00:24:29.144 killing process with pid 3954036 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@965 -- # kill 3954036 00:24:29.144 Received shutdown signal, test time was about 60.000000 seconds 00:24:29.144 00:24:29.144 Latency(us) 00:24:29.144 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:29.144 =================================================================================================================== 00:24:29.144 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:29.144 [2024-05-15 04:25:16.935394] bdev_raid.c:1375:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:29.144 [2024-05-15 04:25:16.935500] bdev_raid.c: 453:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:29.144 04:25:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@970 -- # wait 3954036 00:24:29.144 [2024-05-15 04:25:16.935558] bdev_raid.c: 430:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:29.144 [2024-05-15 04:25:16.935572] bdev_raid.c: 347:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x807f50 name raid_bdev1, state offline 00:24:29.144 [2024-05-15 04:25:16.970496] 
bdev_raid.c:1392:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:29.402 04:25:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@785 -- # return 0 00:24:29.402 00:24:29.402 real 0m29.275s 00:24:29.402 user 0m47.280s 00:24:29.402 sys 0m2.972s 00:24:29.402 04:25:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:29.402 04:25:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:24:29.402 ************************************ 00:24:29.402 END TEST raid_rebuild_test_sb_md_interleaved 00:24:29.402 ************************************ 00:24:29.402 04:25:17 bdev_raid -- bdev/bdev_raid.sh@850 -- # rm -f /raidrandtest 00:24:29.402 00:24:29.402 real 16m3.255s 00:24:29.402 user 28m1.093s 00:24:29.402 sys 2m16.533s 00:24:29.402 04:25:17 bdev_raid -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:29.402 04:25:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:29.402 ************************************ 00:24:29.402 END TEST bdev_raid 00:24:29.402 ************************************ 00:24:29.402 04:25:17 -- spdk/autotest.sh@187 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:24:29.402 04:25:17 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:24:29.402 04:25:17 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:29.402 04:25:17 -- common/autotest_common.sh@10 -- # set +x 00:24:29.402 ************************************ 00:24:29.402 START TEST bdevperf_config 00:24:29.402 ************************************ 00:24:29.402 04:25:17 bdevperf_config -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:24:29.402 * Looking for test storage... 
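The raid suite that just finished and the bdevperf_config suite that starts here both lean on small shell patterns that the trace only shows inline, so two brief standalone sketches are given for reference. Both are illustrative reconstructions based on the commands visible in the trace, not verbatim copies of the bdev_raid.sh or bdevperf/common.sh helpers, and the variable names used in them are chosen here for readability.

First, the raid state checks above repeatedly query the dedicated RPC socket and filter the result with jq; a minimal equivalent, assuming the same socket path, script path, and bdev name that appear in the trace, could look like this:

    # Assumes an SPDK target is already up and listening on the raid test socket.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock

    # Pull one raid bdev's JSON description out of the full listing.
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')

    # Background-process fields default to "none" when no rebuild is running,
    # exactly as the jq filters in the trace do.
    proc_type=$(jq -r '.process.type // "none"' <<< "$info")
    proc_target=$(jq -r '.process.target // "none"' <<< "$info")
    [[ $proc_type == rebuild && $proc_target == spare ]] && echo "rebuild onto spare in progress"

Second, the bdevperf_config run below builds an INI-style job file through repeated create_job calls. A simplified sketch of that pattern follows; the output path is hypothetical (the suite writes test/bdev/bdevperf/test.conf), the rw= and filename= key names are inferred from the local variables in the trace, and the shared defaults that the traced helper cats into the [global] section are omitted:

    testconf=/tmp/bdevperf-test.conf   # hypothetical path for this sketch

    create_job() {
        # $1 = section name (global, job0, ...); $2 = optional rw mode; $3 = optional bdev name
        local job_section=$1 rw=$2 filename=$3
        {
            echo "[$job_section]"
            [[ -n $rw ]] && echo "rw=$rw"
            [[ -n $filename ]] && echo "filename=$filename"
            echo
        } >> "$testconf"
    }

    create_job global read Malloc0
    create_job job0
    create_job job1
    create_job job2
    create_job job3
    # The generated file is then handed to bdevperf together with the bdev layout,
    # matching the invocation seen later in this log:
    #   build/examples/bdevperf -t 2 --json conf.json -j "$testconf"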
00:24:29.402 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:29.402 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:29.402 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:29.402 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:24:29.402 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:29.402 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:29.402 04:25:17 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:32.678 04:25:20 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-05-15 04:25:17.427942] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:32.678 [2024-05-15 04:25:17.428028] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3957946 ] 00:24:32.678 Using job config with 4 jobs 00:24:32.678 [2024-05-15 04:25:17.527728] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:32.678 [2024-05-15 04:25:17.655595] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:32.678 cpumask for '\''job0'\'' is too big 00:24:32.678 cpumask for '\''job1'\'' is too big 00:24:32.678 cpumask for '\''job2'\'' is too big 00:24:32.678 cpumask for '\''job3'\'' is too big 00:24:32.678 Running I/O for 2 seconds... 00:24:32.678 00:24:32.678 Latency(us) 00:24:32.678 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24117.22 23.55 0.00 0.00 10600.87 1917.53 16311.18 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24096.52 23.53 0.00 0.00 10586.21 1868.99 14466.47 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24076.06 23.51 0.00 0.00 10570.96 1868.99 12524.66 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24149.81 23.58 0.00 0.00 10515.43 873.81 10922.67 00:24:32.679 =================================================================================================================== 00:24:32.679 Total : 96439.61 94.18 0.00 0.00 10568.30 873.81 16311.18' 00:24:32.679 04:25:20 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-05-15 04:25:17.427942] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:24:32.679 [2024-05-15 04:25:17.428028] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3957946 ] 00:24:32.679 Using job config with 4 jobs 00:24:32.679 [2024-05-15 04:25:17.527728] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:32.679 [2024-05-15 04:25:17.655595] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:32.679 cpumask for '\''job0'\'' is too big 00:24:32.679 cpumask for '\''job1'\'' is too big 00:24:32.679 cpumask for '\''job2'\'' is too big 00:24:32.679 cpumask for '\''job3'\'' is too big 00:24:32.679 Running I/O for 2 seconds... 00:24:32.679 00:24:32.679 Latency(us) 00:24:32.679 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24117.22 23.55 0.00 0.00 10600.87 1917.53 16311.18 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24096.52 23.53 0.00 0.00 10586.21 1868.99 14466.47 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24076.06 23.51 0.00 0.00 10570.96 1868.99 12524.66 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24149.81 23.58 0.00 0.00 10515.43 873.81 10922.67 00:24:32.679 =================================================================================================================== 00:24:32.679 Total : 96439.61 94.18 0.00 0.00 10568.30 873.81 16311.18' 00:24:32.679 04:25:20 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-05-15 04:25:17.427942] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:32.679 [2024-05-15 04:25:17.428028] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3957946 ] 00:24:32.679 Using job config with 4 jobs 00:24:32.679 [2024-05-15 04:25:17.527728] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:32.679 [2024-05-15 04:25:17.655595] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:32.679 cpumask for '\''job0'\'' is too big 00:24:32.679 cpumask for '\''job1'\'' is too big 00:24:32.679 cpumask for '\''job2'\'' is too big 00:24:32.679 cpumask for '\''job3'\'' is too big 00:24:32.679 Running I/O for 2 seconds... 
00:24:32.679 00:24:32.679 Latency(us) 00:24:32.679 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24117.22 23.55 0.00 0.00 10600.87 1917.53 16311.18 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24096.52 23.53 0.00 0.00 10586.21 1868.99 14466.47 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24076.06 23.51 0.00 0.00 10570.96 1868.99 12524.66 00:24:32.679 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:32.679 Malloc0 : 2.02 24149.81 23.58 0.00 0.00 10515.43 873.81 10922.67 00:24:32.679 =================================================================================================================== 00:24:32.679 Total : 96439.61 94.18 0.00 0.00 10568.30 873.81 16311.18' 00:24:32.679 04:25:20 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:24:32.679 04:25:20 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:24:32.679 04:25:20 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:24:32.679 04:25:20 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:32.679 [2024-05-15 04:25:20.239274] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:32.679 [2024-05-15 04:25:20.239355] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3958333 ] 00:24:32.679 [2024-05-15 04:25:20.343436] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:32.679 [2024-05-15 04:25:20.496703] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:32.679 cpumask for 'job0' is too big 00:24:32.679 cpumask for 'job1' is too big 00:24:32.679 cpumask for 'job2' is too big 00:24:32.679 cpumask for 'job3' is too big 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:24:35.203 Running I/O for 2 seconds... 
00:24:35.203 00:24:35.203 Latency(us) 00:24:35.203 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:35.203 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:35.203 Malloc0 : 2.02 24322.73 23.75 0.00 0.00 10510.40 1917.53 16311.18 00:24:35.203 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:35.203 Malloc0 : 2.02 24301.97 23.73 0.00 0.00 10496.01 1893.26 14369.37 00:24:35.203 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:35.203 Malloc0 : 2.02 24281.57 23.71 0.00 0.00 10480.71 1881.13 12427.57 00:24:35.203 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:24:35.203 Malloc0 : 2.03 24261.03 23.69 0.00 0.00 10465.51 1881.13 10825.58 00:24:35.203 =================================================================================================================== 00:24:35.203 Total : 97167.30 94.89 0.00 0.00 10488.16 1881.13 16311.18' 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:35.203 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:35.203 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:35.203 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:35.203 04:25:23 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:38.482 04:25:25 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-05-15 04:25:23.087969] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:38.482 [2024-05-15 04:25:23.088058] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3958607 ] 00:24:38.482 Using job config with 3 jobs 00:24:38.482 [2024-05-15 04:25:23.179531] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:38.482 [2024-05-15 04:25:23.313663] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:38.482 cpumask for '\''job0'\'' is too big 00:24:38.482 cpumask for '\''job1'\'' is too big 00:24:38.482 cpumask for '\''job2'\'' is too big 00:24:38.482 Running I/O for 2 seconds... 00:24:38.482 00:24:38.482 Latency(us) 00:24:38.482 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:38.482 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:38.482 Malloc0 : 2.01 32865.67 32.10 0.00 0.00 7786.28 1868.99 11553.75 00:24:38.483 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:38.483 Malloc0 : 2.02 32878.89 32.11 0.00 0.00 7766.15 1832.58 9709.04 00:24:38.483 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:38.483 Malloc0 : 2.02 32851.02 32.08 0.00 0.00 7755.39 1820.44 8058.50 00:24:38.483 =================================================================================================================== 00:24:38.483 Total : 98595.58 96.28 0.00 0.00 7769.25 1820.44 11553.75' 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-05-15 04:25:23.087969] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:38.483 [2024-05-15 04:25:23.088058] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3958607 ] 00:24:38.483 Using job config with 3 jobs 00:24:38.483 [2024-05-15 04:25:23.179531] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:38.483 [2024-05-15 04:25:23.313663] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:38.483 cpumask for '\''job0'\'' is too big 00:24:38.483 cpumask for '\''job1'\'' is too big 00:24:38.483 cpumask for '\''job2'\'' is too big 00:24:38.483 Running I/O for 2 seconds... 
00:24:38.483 00:24:38.483 Latency(us) 00:24:38.483 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:38.483 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:38.483 Malloc0 : 2.01 32865.67 32.10 0.00 0.00 7786.28 1868.99 11553.75 00:24:38.483 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:38.483 Malloc0 : 2.02 32878.89 32.11 0.00 0.00 7766.15 1832.58 9709.04 00:24:38.483 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:38.483 Malloc0 : 2.02 32851.02 32.08 0.00 0.00 7755.39 1820.44 8058.50 00:24:38.483 =================================================================================================================== 00:24:38.483 Total : 98595.58 96.28 0.00 0.00 7769.25 1820.44 11553.75' 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-05-15 04:25:23.087969] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:38.483 [2024-05-15 04:25:23.088058] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3958607 ] 00:24:38.483 Using job config with 3 jobs 00:24:38.483 [2024-05-15 04:25:23.179531] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:38.483 [2024-05-15 04:25:23.313663] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:38.483 cpumask for '\''job0'\'' is too big 00:24:38.483 cpumask for '\''job1'\'' is too big 00:24:38.483 cpumask for '\''job2'\'' is too big 00:24:38.483 Running I/O for 2 seconds... 00:24:38.483 00:24:38.483 Latency(us) 00:24:38.483 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:38.483 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:38.483 Malloc0 : 2.01 32865.67 32.10 0.00 0.00 7786.28 1868.99 11553.75 00:24:38.483 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:38.483 Malloc0 : 2.02 32878.89 32.11 0.00 0.00 7766.15 1832.58 9709.04 00:24:38.483 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:24:38.483 Malloc0 : 2.02 32851.02 32.08 0.00 0.00 7755.39 1820.44 8058.50 00:24:38.483 =================================================================================================================== 00:24:38.483 Total : 98595.58 96.28 0.00 0.00 7769.25 1820.44 11553.75' 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:24:38.483 04:25:25 
bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:38.483 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:38.483 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:38.483 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:38.483 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:24:38.483 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:24:38.483 04:25:25 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:41.014 04:25:28 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-05-15 04:25:25.905653] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:24:41.014 [2024-05-15 04:25:25.905740] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3959005 ] 00:24:41.014 Using job config with 4 jobs 00:24:41.014 [2024-05-15 04:25:26.011179] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:41.014 [2024-05-15 04:25:26.160435] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:41.014 cpumask for '\''job0'\'' is too big 00:24:41.014 cpumask for '\''job1'\'' is too big 00:24:41.014 cpumask for '\''job2'\'' is too big 00:24:41.014 cpumask for '\''job3'\'' is too big 00:24:41.014 Running I/O for 2 seconds... 00:24:41.014 00:24:41.014 Latency(us) 00:24:41.014 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.03 12131.53 11.85 0.00 0.00 21078.86 3907.89 32816.55 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.03 12120.89 11.84 0.00 0.00 21077.68 4636.07 32816.55 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.05 12141.65 11.86 0.00 0.00 20966.84 3835.07 28932.93 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.05 12131.30 11.85 0.00 0.00 20964.66 4587.52 28932.93 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.05 12121.23 11.84 0.00 0.00 20907.61 3835.07 25049.32 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.05 12110.91 11.83 0.00 0.00 20905.68 4587.52 25049.32 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.05 12100.85 11.82 0.00 0.00 20849.71 3859.34 21456.97 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.05 12090.51 11.81 0.00 0.00 20848.00 4587.52 21456.97 00:24:41.014 =================================================================================================================== 00:24:41.014 Total : 96948.88 94.68 0.00 0.00 20949.55 3835.07 32816.55' 00:24:41.014 04:25:28 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-05-15 04:25:25.905653] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:41.014 [2024-05-15 04:25:25.905740] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3959005 ] 00:24:41.014 Using job config with 4 jobs 00:24:41.014 [2024-05-15 04:25:26.011179] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:41.014 [2024-05-15 04:25:26.160435] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:41.014 cpumask for '\''job0'\'' is too big 00:24:41.014 cpumask for '\''job1'\'' is too big 00:24:41.014 cpumask for '\''job2'\'' is too big 00:24:41.014 cpumask for '\''job3'\'' is too big 00:24:41.014 Running I/O for 2 seconds... 
00:24:41.014 00:24:41.014 Latency(us) 00:24:41.014 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.03 12131.53 11.85 0.00 0.00 21078.86 3907.89 32816.55 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.03 12120.89 11.84 0.00 0.00 21077.68 4636.07 32816.55 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.05 12141.65 11.86 0.00 0.00 20966.84 3835.07 28932.93 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.05 12131.30 11.85 0.00 0.00 20964.66 4587.52 28932.93 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.05 12121.23 11.84 0.00 0.00 20907.61 3835.07 25049.32 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.05 12110.91 11.83 0.00 0.00 20905.68 4587.52 25049.32 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.05 12100.85 11.82 0.00 0.00 20849.71 3859.34 21456.97 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.05 12090.51 11.81 0.00 0.00 20848.00 4587.52 21456.97 00:24:41.014 =================================================================================================================== 00:24:41.014 Total : 96948.88 94.68 0.00 0.00 20949.55 3835.07 32816.55' 00:24:41.014 04:25:28 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-05-15 04:25:25.905653] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:41.014 [2024-05-15 04:25:25.905740] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3959005 ] 00:24:41.014 Using job config with 4 jobs 00:24:41.014 [2024-05-15 04:25:26.011179] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:41.014 [2024-05-15 04:25:26.160435] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:41.014 cpumask for '\''job0'\'' is too big 00:24:41.014 cpumask for '\''job1'\'' is too big 00:24:41.014 cpumask for '\''job2'\'' is too big 00:24:41.014 cpumask for '\''job3'\'' is too big 00:24:41.014 Running I/O for 2 seconds... 
00:24:41.014 00:24:41.014 Latency(us) 00:24:41.014 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.03 12131.53 11.85 0.00 0.00 21078.86 3907.89 32816.55 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.03 12120.89 11.84 0.00 0.00 21077.68 4636.07 32816.55 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.05 12141.65 11.86 0.00 0.00 20966.84 3835.07 28932.93 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.05 12131.30 11.85 0.00 0.00 20964.66 4587.52 28932.93 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.05 12121.23 11.84 0.00 0.00 20907.61 3835.07 25049.32 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.05 12110.91 11.83 0.00 0.00 20905.68 4587.52 25049.32 00:24:41.014 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc0 : 2.05 12100.85 11.82 0.00 0.00 20849.71 3859.34 21456.97 00:24:41.014 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:24:41.014 Malloc1 : 2.05 12090.51 11.81 0.00 0.00 20848.00 4587.52 21456.97 00:24:41.014 =================================================================================================================== 00:24:41.014 Total : 96948.88 94.68 0.00 0.00 20949.55 3835.07 32816.55' 00:24:41.014 04:25:28 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:24:41.014 04:25:28 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:24:41.014 04:25:28 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:24:41.014 04:25:28 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:24:41.014 04:25:28 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:24:41.014 04:25:28 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:24:41.014 00:24:41.014 real 0m11.421s 00:24:41.014 user 0m10.166s 00:24:41.014 sys 0m1.048s 00:24:41.014 04:25:28 bdevperf_config -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:41.014 04:25:28 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:24:41.014 ************************************ 00:24:41.014 END TEST bdevperf_config 00:24:41.014 ************************************ 00:24:41.014 04:25:28 -- spdk/autotest.sh@188 -- # uname -s 00:24:41.014 04:25:28 -- spdk/autotest.sh@188 -- # [[ Linux == Linux ]] 00:24:41.014 04:25:28 -- spdk/autotest.sh@189 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:24:41.014 04:25:28 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:24:41.014 04:25:28 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:41.014 04:25:28 -- common/autotest_common.sh@10 -- # set +x 00:24:41.014 ************************************ 00:24:41.014 START TEST reactor_set_interrupt 00:24:41.014 ************************************ 00:24:41.014 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@1121 -- # 
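The trace above shows how test_config.sh closes out the bdevperf_config run: common.sh's get_num_jobs echoes the captured bdevperf output, pulls the job count out of the "Using job config with N jobs" banner with two grep -oE passes, and the test then asserts that the count is 4 before cleaning up test.conf. A minimal standalone sketch of that extraction, assuming the captured output is held in a shell variable (the function body here mirrors the grep pipeline visible in the trace, not the literal common.sh source):

    # Sketch: extract the job count from captured bdevperf output.
    get_num_jobs() {
        echo "$1" | grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+'
    }

    # Usage, mirroring the assertion at test_config.sh@43:
    # bdevperf_output is assumed to hold the text logged above.
    [[ $(get_num_jobs "$bdevperf_output") == "4" ]]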
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:24:41.014 * Looking for test storage... 00:24:41.015 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:41.015 04:25:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:24:41.015 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:24:41.015 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:41.015 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:41.015 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:24:41.015 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:41.015 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:24:41.015 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:24:41.015 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:24:41.015 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:24:41.015 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:24:41.015 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:24:41.015 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:24:41.015 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@14 -- # 
CONFIG_TSAN=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 
00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:24:41.015 04:25:28 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:24:41.015 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:24:41.015 04:25:28 reactor_set_interrupt -- 
common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:24:41.015 04:25:28 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:24:41.015 #define SPDK_CONFIG_H 00:24:41.015 #define SPDK_CONFIG_APPS 1 00:24:41.015 #define SPDK_CONFIG_ARCH native 00:24:41.015 #undef SPDK_CONFIG_ASAN 00:24:41.015 #undef SPDK_CONFIG_AVAHI 00:24:41.016 #undef SPDK_CONFIG_CET 00:24:41.016 #define SPDK_CONFIG_COVERAGE 1 00:24:41.016 #define SPDK_CONFIG_CROSS_PREFIX 00:24:41.016 #define SPDK_CONFIG_CRYPTO 1 00:24:41.016 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:24:41.016 #undef SPDK_CONFIG_CUSTOMOCF 00:24:41.016 #undef SPDK_CONFIG_DAOS 00:24:41.016 #define SPDK_CONFIG_DAOS_DIR 00:24:41.016 #define SPDK_CONFIG_DEBUG 1 00:24:41.016 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:24:41.016 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:24:41.016 #define SPDK_CONFIG_DPDK_INC_DIR 00:24:41.016 #define SPDK_CONFIG_DPDK_LIB_DIR 00:24:41.016 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:24:41.016 #undef SPDK_CONFIG_DPDK_UADK 00:24:41.016 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:24:41.016 #define SPDK_CONFIG_EXAMPLES 1 00:24:41.016 #undef SPDK_CONFIG_FC 00:24:41.016 #define SPDK_CONFIG_FC_PATH 00:24:41.016 #define SPDK_CONFIG_FIO_PLUGIN 1 00:24:41.016 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:24:41.016 #undef SPDK_CONFIG_FUSE 00:24:41.016 #undef SPDK_CONFIG_FUZZER 00:24:41.016 #define SPDK_CONFIG_FUZZER_LIB 00:24:41.016 #undef SPDK_CONFIG_GOLANG 00:24:41.016 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:24:41.016 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:24:41.016 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:24:41.016 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:24:41.016 #undef 
SPDK_CONFIG_HAVE_LIBARCHIVE 00:24:41.016 #undef SPDK_CONFIG_HAVE_LIBBSD 00:24:41.016 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:24:41.016 #define SPDK_CONFIG_IDXD 1 00:24:41.016 #undef SPDK_CONFIG_IDXD_KERNEL 00:24:41.016 #define SPDK_CONFIG_IPSEC_MB 1 00:24:41.016 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:24:41.016 #define SPDK_CONFIG_ISAL 1 00:24:41.016 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:24:41.016 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:24:41.016 #define SPDK_CONFIG_LIBDIR 00:24:41.016 #undef SPDK_CONFIG_LTO 00:24:41.016 #define SPDK_CONFIG_MAX_LCORES 00:24:41.016 #define SPDK_CONFIG_NVME_CUSE 1 00:24:41.016 #undef SPDK_CONFIG_OCF 00:24:41.016 #define SPDK_CONFIG_OCF_PATH 00:24:41.016 #define SPDK_CONFIG_OPENSSL_PATH 00:24:41.016 #undef SPDK_CONFIG_PGO_CAPTURE 00:24:41.016 #define SPDK_CONFIG_PGO_DIR 00:24:41.016 #undef SPDK_CONFIG_PGO_USE 00:24:41.016 #define SPDK_CONFIG_PREFIX /usr/local 00:24:41.016 #undef SPDK_CONFIG_RAID5F 00:24:41.016 #undef SPDK_CONFIG_RBD 00:24:41.016 #define SPDK_CONFIG_RDMA 1 00:24:41.016 #define SPDK_CONFIG_RDMA_PROV verbs 00:24:41.016 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:24:41.016 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:24:41.016 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:24:41.016 #define SPDK_CONFIG_SHARED 1 00:24:41.016 #undef SPDK_CONFIG_SMA 00:24:41.016 #define SPDK_CONFIG_TESTS 1 00:24:41.016 #undef SPDK_CONFIG_TSAN 00:24:41.016 #define SPDK_CONFIG_UBLK 1 00:24:41.016 #define SPDK_CONFIG_UBSAN 1 00:24:41.016 #undef SPDK_CONFIG_UNIT_TESTS 00:24:41.016 #undef SPDK_CONFIG_URING 00:24:41.016 #define SPDK_CONFIG_URING_PATH 00:24:41.016 #undef SPDK_CONFIG_URING_ZNS 00:24:41.016 #undef SPDK_CONFIG_USDT 00:24:41.016 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:24:41.016 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:24:41.016 #undef SPDK_CONFIG_VFIO_USER 00:24:41.016 #define SPDK_CONFIG_VFIO_USER_DIR 00:24:41.016 #define SPDK_CONFIG_VHOST 1 00:24:41.016 #define SPDK_CONFIG_VIRTIO 1 00:24:41.016 #undef SPDK_CONFIG_VTUNE 00:24:41.016 #define SPDK_CONFIG_VTUNE_DIR 00:24:41.016 #define SPDK_CONFIG_WERROR 1 00:24:41.016 #define SPDK_CONFIG_WPDK_DIR 00:24:41.016 #undef SPDK_CONFIG_XNVME 00:24:41.016 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:24:41.016 04:25:28 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:41.016 04:25:28 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:41.016 04:25:28 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:41.016 04:25:28 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:41.016 04:25:28 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:41.016 04:25:28 reactor_set_interrupt -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:41.016 04:25:28 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:41.016 04:25:28 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:24:41.016 04:25:28 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:24:41.016 04:25:28 reactor_set_interrupt -- 
pm/common@76 -- # SUDO[1]='sudo -E' 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:24:41.016 04:25:28 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@57 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@61 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@63 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@65 -- # : 1 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@67 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@69 -- # : 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@71 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@73 -- # : 1 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@75 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@77 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@79 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@81 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@83 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@85 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@86 -- # export 
SPDK_TEST_NVME_CLI 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@87 -- # : 0 00:24:41.016 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@89 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@91 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@93 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@95 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@97 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@99 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@101 -- # : rdma 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@103 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@105 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@107 -- # : 1 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@109 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@111 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@113 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@115 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@117 -- # : 1 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@119 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@121 -- # : 1 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:24:41.017 04:25:28 reactor_set_interrupt -- 
common/autotest_common.sh@123 -- # : 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@125 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@127 -- # : 1 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@129 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@131 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@133 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@135 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@137 -- # : 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@139 -- # : true 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@141 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@143 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@145 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@147 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@149 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@151 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@153 -- # : 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@155 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@157 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@159 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- 
common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@161 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@163 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@166 -- # : 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@168 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@170 -- # : 0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@180 -- # export 
PCI_BLOCK_SYNC_ON_RESET=yes 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@199 -- # cat 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@235 -- # echo leak:libfuse3.so 00:24:41.017 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@237 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@237 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@239 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@239 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@241 -- # '[' -z /var/spdk/dependencies ']' 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@244 -- # export DEPENDENCY_DIR 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@248 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@248 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 
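In the environment setup traced above, autotest_common.sh exports ASAN_OPTIONS and UBSAN_OPTIONS, regenerates a LeakSanitizer suppression file under /var/tmp, adds a leak:libfuse3.so entry to it, and wires it in through LSAN_OPTIONS. A condensed sketch of that pattern, using the option strings and paths shown in the trace (the real script builds the suppression file in a few more steps):

    # Sanitizer plumbing as traced in the autotest_common.sh@192-237 region.
    export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
    export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134

    # Rebuild the LeakSanitizer suppression file (condensed) and point LSAN at it.
    asan_suppression_file=/var/tmp/asan_suppression_file
    rm -rf "$asan_suppression_file"
    echo "leak:libfuse3.so" > "$asan_suppression_file"
    export LSAN_OPTIONS=suppressions=$asan_suppression_file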
00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@252 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@252 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@253 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@255 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@255 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@258 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@258 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@261 -- # '[' 0 -eq 0 ']' 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@262 -- # export valgrind= 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@262 -- # valgrind= 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@268 -- # uname -s 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@268 -- # '[' Linux = Linux ']' 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@269 -- # HUGEMEM=4096 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@270 -- # export CLEAR_HUGE=yes 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@270 -- # CLEAR_HUGE=yes 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@271 -- # [[ 1 -eq 1 ]] 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@275 -- # export HUGE_EVEN_ALLOC=yes 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@275 -- # HUGE_EVEN_ALLOC=yes 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@278 -- # MAKE=make 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKEFLAGS=-j48 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@295 -- # export HUGEMEM=4096 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@295 -- # HUGEMEM=4096 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@297 -- # NO_HUGE=() 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@298 -- # TEST_MODE= 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@317 -- # [[ -z 3959318 ]] 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@317 -- # kill -0 3959318 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@1676 -- # set_test_storage 2147483648 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@327 -- # [[ -v testdir ]] 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@329 -- # local requested_size=2147483648 00:24:41.018 04:25:28 reactor_set_interrupt 
-- common/autotest_common.sh@330 -- # local mount target_dir 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@332 -- # local -A mounts fss sizes avails uses 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local source fs size avail mount use 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@335 -- # local storage_fallback storage_candidates 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@337 -- # mktemp -udt spdk.XXXXXX 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@337 -- # storage_fallback=/tmp/spdk.ipFd19 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@342 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@344 -- # [[ -n '' ]] 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@349 -- # [[ -n '' ]] 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@354 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.ipFd19/tests/interrupt /tmp/spdk.ipFd19 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@357 -- # requested_size=2214592512 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@326 -- # df -T 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@326 -- # grep -v Filesystem 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_devtmpfs 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=devtmpfs 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=67108864 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=67108864 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=0 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=/dev/pmem0 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=ext2 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=969003008 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=5284429824 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=4315426816 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_root 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=overlay 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=55821103104 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=61994737664 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=6173634560 00:24:41.018 04:25:28 reactor_set_interrupt -- 
common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=30992658432 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=30997368832 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=4710400 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=12389998592 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=12398948352 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=8949760 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=30996676608 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=30997368832 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=692224 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:24:41.018 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # avails["$mount"]=6199468032 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@361 -- # sizes["$mount"]=6199472128 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@362 -- # uses["$mount"]=4096 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@365 -- # printf '* Looking for test storage...\n' 00:24:41.019 * Looking for test storage... 
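The set_test_storage trace above walks 'df -T' output line by line and records each mount's device, filesystem type, size, used and available space in associative arrays keyed by mount point, so that a test directory with at least the requested ~2.2 GB of headroom can be selected in the step that follows. A compact sketch of that inventory loop, with the variable names taken from the trace (the real helper runs inside a function and uses local arrays):

    # Sketch of the df -T inventory in set_test_storage.
    declare -A mounts fss sizes avails uses
    requested_size=2214592512   # value requested in the trace (roughly 2 GiB plus margin)

    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$size
        uses["$mount"]=$use
        avails["$mount"]=$avail
    done < <(df -T | grep -v Filesystem)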
00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@367 -- # local target_space new_size 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@368 -- # for target_dir in "${storage_candidates[@]}" 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@371 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@371 -- # awk '$1 !~ /Filesystem/{print $6}' 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@371 -- # mount=/ 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@373 -- # target_space=55821103104 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@374 -- # (( target_space == 0 || target_space < requested_size )) 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@377 -- # (( target_space >= requested_size )) 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@379 -- # [[ overlay == tmpfs ]] 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@379 -- # [[ overlay == ramfs ]] 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@379 -- # [[ / == / ]] 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@380 -- # new_size=8388227072 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@381 -- # (( new_size * 100 / sizes[/] > 95 )) 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@386 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@386 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@387 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:41.019 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@388 -- # return 0 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@1678 -- # set -o errtrace 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@1679 -- # shopt -s extdebug 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # true 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@1685 -- # xtrace_fd 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=3959362 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:41.019 04:25:28 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 3959362 /var/tmp/spdk.sock 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@827 -- # '[' -z 3959362 ']' 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:41.019 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
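interrupt_common.sh then launches the interrupt_tgt example on cpumask 0x07 with its RPC server on /var/tmp/spdk.sock, installs a cleanup trap, and blocks in waitforlisten until the UNIX-domain socket comes up. A simplified sketch of that launch-and-wait pattern, assuming SPDK_EXAMPLE_DIR as exported earlier in the trace (the polling loop below is a stand-in for the real waitforlisten helper, and the trap is condensed from the harness's killprocess/cleanup handlers):

    # Sketch of the interrupt_tgt startup traced at interrupt_common.sh@20-26.
    rpc_addr=/var/tmp/spdk.sock
    cpu_mask=0x07

    "$SPDK_EXAMPLE_DIR/interrupt_tgt" -m "$cpu_mask" -r "$rpc_addr" -E -g &
    intr_tgt_pid=$!
    trap 'kill -9 $intr_tgt_pid; exit 1' SIGINT SIGTERM EXIT

    # Stand-in for waitforlisten: poll until the RPC socket exists.
    while [[ ! -S "$rpc_addr" ]]; do sleep 0.1; done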
00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:41.019 04:25:28 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:24:41.019 [2024-05-15 04:25:28.973609] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:41.019 [2024-05-15 04:25:28.973675] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3959362 ] 00:24:41.300 [2024-05-15 04:25:29.055180] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:41.300 [2024-05-15 04:25:29.167550] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:24:41.300 [2024-05-15 04:25:29.167601] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:24:41.300 [2024-05-15 04:25:29.167605] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:41.300 [2024-05-15 04:25:29.260247] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:24:41.300 04:25:29 reactor_set_interrupt -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:41.300 04:25:29 reactor_set_interrupt -- common/autotest_common.sh@860 -- # return 0 00:24:41.300 04:25:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:24:41.300 04:25:29 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:41.903 Malloc0 00:24:41.903 Malloc1 00:24:41.903 Malloc2 00:24:41.903 04:25:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:24:41.903 04:25:29 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:24:41.903 04:25:29 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:24:41.903 04:25:29 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:24:41.903 5000+0 records in 00:24:41.903 5000+0 records out 00:24:41.903 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0139096 s, 736 MB/s 00:24:41.903 04:25:29 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:24:42.161 AIO0 00:24:42.161 04:25:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 3959362 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 3959362 without_thd 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=3959362 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/common.sh@58 -- # 
reactor_cpumask=1 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:24:42.162 04:25:29 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:42.420 04:25:30 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:24:42.420 04:25:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:24:42.420 04:25:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:24:42.420 04:25:30 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:24:42.420 04:25:30 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:24:42.420 04:25:30 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:24:42.420 04:25:30 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:42.420 04:25:30 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:24:42.420 04:25:30 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:24:42.678 spdk_thread ids are 1 on reactor0. 
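The thd0_ids/thd2_ids lookups above resolve which spdk_thread IDs live on a given reactor: the helper asks the target for thread statistics and filters on the cpumask, stripping the 0x prefix first (which is why 0x1 becomes 1 in the jq arguments). The query, lifted straight from the commands in the trace:

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    # list thread IDs pinned to reactor 0 (cpumask 0x1, matched as "1")
    $rpc_py thread_get_stats | \
        jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'

In this run the filter returns 1 for reactor 0 and nothing for reactor 2, which is what the 'spdk_thread ids are 1 on reactor0.' message records.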
00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3959362 0 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3959362 0 idle 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3959362 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3959362 -w 256 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:42.678 04:25:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3959362 root 20 0 128.2g 36480 23424 S 6.7 0.1 0:00.36 reactor_0' 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3959362 root 20 0 128.2g 36480 23424 S 6.7 0.1 0:00.36 reactor_0 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3959362 1 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3959362 1 idle 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3959362 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3959362 -w 256 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 
00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3959412 root 20 0 128.2g 36480 23424 S 0.0 0.1 0:00.00 reactor_1' 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3959412 root 20 0 128.2g 36480 23424 S 0.0 0.1 0:00.00 reactor_1 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3959362 2 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3959362 2 idle 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3959362 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3959362 -w 256 00:24:42.937 04:25:30 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:43.195 04:25:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3959413 root 20 0 128.2g 36480 23424 S 0.0 0.1 0:00.00 reactor_2' 00:24:43.195 04:25:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3959413 root 20 0 128.2g 36480 23424 S 0.0 0.1 0:00.00 reactor_2 00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 
00:24:43.196 04:25:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:24:43.453 [2024-05-15 04:25:31.256386] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:24:43.453 04:25:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:24:43.711 [2024-05-15 04:25:31.504169] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:24:43.711 [2024-05-15 04:25:31.504383] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:43.711 04:25:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:24:43.969 [2024-05-15 04:25:31.748152] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:24:43.969 [2024-05-15 04:25:31.748286] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3959362 0 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3959362 0 busy 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3959362 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3959362 -w 256 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3959362 root 20 0 128.2g 36480 23424 R 99.9 0.1 0:00.79 reactor_0' 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3959362 root 20 0 128.2g 36480 23424 R 99.9 0.1 0:00.79 reactor_0 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:43.969 04:25:31 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3959362 2 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3959362 2 busy 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3959362 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3959362 -w 256 00:24:43.969 04:25:31 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:44.227 04:25:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3959413 root 20 0 128.2g 36480 23424 R 99.9 0.1 0:00.34 reactor_2' 00:24:44.227 04:25:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3959413 root 20 0 128.2g 36480 23424 R 99.9 0.1 0:00.34 reactor_2 00:24:44.227 04:25:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:44.227 04:25:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:44.227 04:25:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:24:44.227 04:25:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:24:44.227 04:25:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:24:44.227 04:25:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:24:44.227 04:25:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:24:44.227 04:25:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:44.228 04:25:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:24:44.486 [2024-05-15 04:25:32.332184] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:24:44.486 [2024-05-15 04:25:32.332312] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 3959362 2 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3959362 2 idle 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3959362 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3959362 -w 256 00:24:44.486 04:25:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3959413 root 20 0 128.2g 36480 23424 S 0.0 0.1 0:00.58 reactor_2' 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3959413 root 20 0 128.2g 36480 23424 S 0.0 0.1 0:00.58 reactor_2 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:44.744 04:25:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:24:44.744 [2024-05-15 04:25:32.752169] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:24:44.744 [2024-05-15 04:25:32.752358] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:45.001 04:25:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:24:45.001 04:25:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:24:45.001 04:25:32 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:24:45.001 [2024-05-15 04:25:33.004342] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
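The mode switches above are driven by the reactor_set_interrupt_mode RPC that the interrupt_tgt example registers as an rpc.py plugin, which is why examples/interrupt_tgt was added to PYTHONPATH earlier in the trace: with -d a reactor is switched back to poll mode, and a second call without -d returns it to interrupt mode. The toggle exercised above, in isolation:

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    export PYTHONPATH=$PYTHONPATH:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt

    # disable interrupt mode on reactor 0: it starts busy-polling (~99.9% CPU for reactor_0 in top)
    $rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d

    # re-enable interrupt mode: reactor_0 drops back to ~0% CPU
    $rpc_py --plugin interrupt_plugin reactor_set_interrupt_mode 0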
00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 3959362 0 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3959362 0 idle 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3959362 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3959362 -w 256 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3959362 root 20 0 128.2g 36480 23424 S 0.0 0.1 0:01.61 reactor_0' 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3959362 root 20 0 128.2g 36480 23424 S 0.0 0.1 0:01.61 reactor_0 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:24:45.259 04:25:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 3959362 00:24:45.259 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@946 -- # '[' -z 3959362 ']' 00:24:45.259 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@950 -- # kill -0 3959362 00:24:45.259 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@951 -- # uname 00:24:45.259 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:45.259 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3959362 00:24:45.259 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:45.259 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:24:45.259 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3959362' 00:24:45.259 killing process with pid 3959362 00:24:45.259 04:25:33 reactor_set_interrupt -- 
common/autotest_common.sh@965 -- # kill 3959362 00:24:45.259 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@970 -- # wait 3959362 00:24:45.517 04:25:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:24:45.517 04:25:33 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:24:45.517 04:25:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:24:45.517 04:25:33 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:45.517 04:25:33 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:24:45.517 04:25:33 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=3960015 00:24:45.517 04:25:33 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:24:45.517 04:25:33 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:45.517 04:25:33 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 3960015 /var/tmp/spdk.sock 00:24:45.517 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@827 -- # '[' -z 3960015 ']' 00:24:45.517 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:45.517 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:45.517 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:45.517 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:45.517 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:45.517 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:24:45.774 [2024-05-15 04:25:33.560207] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:45.774 [2024-05-15 04:25:33.560280] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3960015 ] 00:24:45.774 [2024-05-15 04:25:33.643071] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:45.774 [2024-05-15 04:25:33.760001] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:24:45.774 [2024-05-15 04:25:33.760058] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:24:45.774 [2024-05-15 04:25:33.760076] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:46.033 [2024-05-15 04:25:33.850425] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
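Every reactor_is_busy/reactor_is_idle probe above follows the same pattern: take one batch sample of top for the target pid, pick out the reactor_N thread line, and read its %CPU column; in this run a reactor only counts as busy at 70% or more and as idle at 30% or less. A standalone sketch of those commands as they appear in the trace (the retry loop and error handling of the real helper are omitted):

    pid=3960015 idx=0   # which target and which reactor to inspect

    # one batch iteration of top, thread view, wide output, restricted to the target
    line=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_$idx")
    cpu_rate=$(echo "$line" | sed -e 's/^\s*//g' | awk '{print $9}')   # %CPU column
    cpu_rate=${cpu_rate%.*}                                            # truncate the decimals

    if [ "$cpu_rate" -ge 70 ]; then
        echo "reactor_$idx is busy (poll mode)"
    elif [ "$cpu_rate" -le 30 ]; then
        echo "reactor_$idx is idle (interrupt mode)"
    fi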
00:24:46.033 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:46.033 04:25:33 reactor_set_interrupt -- common/autotest_common.sh@860 -- # return 0 00:24:46.033 04:25:33 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:24:46.033 04:25:33 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:46.291 Malloc0 00:24:46.291 Malloc1 00:24:46.291 Malloc2 00:24:46.291 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:24:46.291 04:25:34 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:24:46.291 04:25:34 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:24:46.291 04:25:34 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:24:46.291 5000+0 records in 00:24:46.291 5000+0 records out 00:24:46.291 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0117464 s, 872 MB/s 00:24:46.291 04:25:34 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:24:46.549 AIO0 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 3960015 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 3960015 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=3960015 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:24:46.549 04:25:34 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:46.807 04:25:34 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:24:46.807 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:24:46.807 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:24:46.807 04:25:34 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:24:46.807 04:25:34 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:24:46.807 04:25:34 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:24:46.807 04:25:34 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:24:46.807 04:25:34 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:24:46.807 04:25:34 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:24:47.065 spdk_thread ids are 1 on reactor0. 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3960015 0 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3960015 0 idle 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3960015 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3960015 -w 256 00:24:47.065 04:25:34 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3960015 root 20 0 128.2g 37632 24192 S 0.0 0.1 0:00.36 reactor_0' 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3960015 root 20 0 128.2g 37632 24192 S 0.0 0.1 0:00.36 reactor_0 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3960015 1 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3960015 1 idle 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3960015 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:24:47.324 04:25:35 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3960015 -w 256 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3960085 root 20 0 128.2g 37632 24192 S 0.0 0.1 0:00.00 reactor_1' 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3960085 root 20 0 128.2g 37632 24192 S 0.0 0.1 0:00.00 reactor_1 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 3960015 2 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3960015 2 idle 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3960015 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3960015 -w 256 00:24:47.324 04:25:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:47.582 04:25:35 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3960086 root 20 0 128.2g 37632 24192 S 0.0 0.1 0:00.00 reactor_2' 00:24:47.582 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3960086 root 20 0 128.2g 37632 24192 S 0.0 0.1 0:00.00 reactor_2 00:24:47.582 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:47.582 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:47.582 04:25:35 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
00:24:47.582 04:25:35 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:47.582 04:25:35 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:47.582 04:25:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:47.582 04:25:35 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:47.582 04:25:35 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:47.583 04:25:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:24:47.583 04:25:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:24:47.842 [2024-05-15 04:25:35.744747] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:24:47.842 [2024-05-15 04:25:35.744883] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:24:47.842 [2024-05-15 04:25:35.744984] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:47.842 04:25:35 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:24:48.100 [2024-05-15 04:25:36.005409] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:24:48.100 [2024-05-15 04:25:36.005637] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3960015 0 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3960015 0 busy 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3960015 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3960015 -w 256 00:24:48.100 04:25:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3960015 root 20 0 128.2g 37632 24192 R 99.9 0.1 0:00.80 reactor_0' 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3960015 root 20 0 128.2g 37632 24192 R 99.9 0.1 0:00.80 reactor_0 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:24:48.358 04:25:36 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 3960015 2 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 3960015 2 busy 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3960015 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3960015 -w 256 00:24:48.358 04:25:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3960086 root 20 0 128.2g 37632 24192 R 99.9 0.1 0:00.34 reactor_2' 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3960086 root 20 0 128.2g 37632 24192 R 99.9 0.1 0:00.34 reactor_2 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:48.359 04:25:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:24:48.617 [2024-05-15 04:25:36.631144] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:24:48.617 [2024-05-15 04:25:36.631326] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 3960015 2 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3960015 2 idle 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3960015 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3960015 -w 256 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3960086 root 20 0 128.2g 37632 24192 S 0.0 0.1 0:00.62 reactor_2' 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3960086 root 20 0 128.2g 37632 24192 S 0.0 0.1 0:00.62 reactor_2 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:48.876 04:25:36 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:24:49.134 [2024-05-15 04:25:37.064278] interrupt_tgt.c: 61:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:24:49.134 [2024-05-15 04:25:37.064524] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:24:49.134 [2024-05-15 04:25:37.064562] interrupt_tgt.c: 32:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 3960015 0 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 3960015 0 idle 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=3960015 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 3960015 -w 256 00:24:49.134 04:25:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:24:49.392 04:25:37 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='3960015 root 20 0 128.2g 37632 24192 S 6.7 0.1 0:01.68 reactor_0' 00:24:49.392 04:25:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 3960015 root 20 0 128.2g 37632 24192 S 6.7 0.1 0:01.68 reactor_0 00:24:49.392 04:25:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:24:49.392 04:25:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:24:49.392 04:25:37 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:24:49.393 04:25:37 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:24:49.393 04:25:37 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:24:49.393 04:25:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:24:49.393 04:25:37 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:24:49.393 04:25:37 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:24:49.393 04:25:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:24:49.393 04:25:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:24:49.393 04:25:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:24:49.393 04:25:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 3960015 00:24:49.393 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@946 -- # '[' -z 3960015 ']' 00:24:49.393 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@950 -- # kill -0 3960015 00:24:49.393 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@951 -- # uname 00:24:49.393 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:49.393 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3960015 00:24:49.393 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:49.393 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@956 -- # '[' reactor_0 = 
sudo ']' 00:24:49.393 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3960015' 00:24:49.393 killing process with pid 3960015 00:24:49.393 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@965 -- # kill 3960015 00:24:49.393 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@970 -- # wait 3960015 00:24:49.651 04:25:37 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:24:49.651 04:25:37 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:24:49.651 00:24:49.651 real 0m8.797s 00:24:49.651 user 0m9.573s 00:24:49.651 sys 0m1.526s 00:24:49.651 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:49.651 04:25:37 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:24:49.651 ************************************ 00:24:49.651 END TEST reactor_set_interrupt 00:24:49.651 ************************************ 00:24:49.651 04:25:37 -- spdk/autotest.sh@190 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:24:49.651 04:25:37 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:24:49.651 04:25:37 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:49.651 04:25:37 -- common/autotest_common.sh@10 -- # set +x 00:24:49.651 ************************************ 00:24:49.651 START TEST reap_unregistered_poller 00:24:49.651 ************************************ 00:24:49.651 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:24:49.912 * Looking for test storage... 00:24:49.912 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:49.912 04:25:37 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:24:49.912 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:24:49.912 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:49.912 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:49.912 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
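The sourcing preamble at the start of reap_unregistered_poller is how every script under test/interrupt locates itself and the repository root before pulling in autotest_common.sh. Paraphrasing the readlink calls shown in the trace (the real interrupt_common.sh may resolve the script path slightly differently):

    # inside test/interrupt/interrupt_common.sh, sourced by reap_unregistered_poller.sh
    testdir=$(readlink -f "$(dirname "$0")")   # .../spdk/test/interrupt
    rootdir=$(readlink -f "$testdir/../..")    # .../spdk
    source "$rootdir/test/common/autotest_common.sh"

autotest_common.sh in turn sources test/common/build_config.sh, which is where the long CONFIG_* listing below comes from.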
00:24:49.912 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:49.912 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:24:49.912 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:24:49.912 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:24:49.912 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:24:49.912 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:24:49.912 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:24:49.912 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:24:49.912 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:24:49.912 04:25:37 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 
00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 
00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:24:49.913 04:25:37 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:24:49.913 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@11 -- # 
_test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:24:49.913 04:25:37 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:24:49.913 #define SPDK_CONFIG_H 00:24:49.913 #define SPDK_CONFIG_APPS 1 00:24:49.913 #define SPDK_CONFIG_ARCH native 00:24:49.913 #undef SPDK_CONFIG_ASAN 00:24:49.913 #undef SPDK_CONFIG_AVAHI 00:24:49.913 #undef SPDK_CONFIG_CET 00:24:49.913 #define SPDK_CONFIG_COVERAGE 1 00:24:49.913 #define SPDK_CONFIG_CROSS_PREFIX 00:24:49.913 #define SPDK_CONFIG_CRYPTO 1 00:24:49.913 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:24:49.913 #undef SPDK_CONFIG_CUSTOMOCF 00:24:49.913 #undef SPDK_CONFIG_DAOS 00:24:49.913 #define SPDK_CONFIG_DAOS_DIR 00:24:49.913 #define SPDK_CONFIG_DEBUG 1 00:24:49.913 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:24:49.913 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:24:49.913 #define SPDK_CONFIG_DPDK_INC_DIR 00:24:49.913 #define SPDK_CONFIG_DPDK_LIB_DIR 00:24:49.913 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:24:49.913 #undef SPDK_CONFIG_DPDK_UADK 00:24:49.913 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:24:49.913 #define SPDK_CONFIG_EXAMPLES 1 00:24:49.913 #undef SPDK_CONFIG_FC 00:24:49.913 #define SPDK_CONFIG_FC_PATH 00:24:49.913 #define SPDK_CONFIG_FIO_PLUGIN 1 00:24:49.913 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:24:49.913 #undef SPDK_CONFIG_FUSE 00:24:49.913 #undef SPDK_CONFIG_FUZZER 00:24:49.913 #define SPDK_CONFIG_FUZZER_LIB 00:24:49.913 #undef SPDK_CONFIG_GOLANG 00:24:49.913 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:24:49.913 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:24:49.913 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:24:49.913 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:24:49.913 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:24:49.913 #undef SPDK_CONFIG_HAVE_LIBBSD 00:24:49.913 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:24:49.913 #define SPDK_CONFIG_IDXD 1 00:24:49.913 #undef SPDK_CONFIG_IDXD_KERNEL 00:24:49.913 #define SPDK_CONFIG_IPSEC_MB 1 00:24:49.913 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:24:49.913 #define SPDK_CONFIG_ISAL 1 00:24:49.913 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:24:49.913 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:24:49.913 #define SPDK_CONFIG_LIBDIR 00:24:49.913 #undef SPDK_CONFIG_LTO 00:24:49.914 #define SPDK_CONFIG_MAX_LCORES 00:24:49.914 #define SPDK_CONFIG_NVME_CUSE 1 00:24:49.914 #undef SPDK_CONFIG_OCF 00:24:49.914 #define 
SPDK_CONFIG_OCF_PATH 00:24:49.914 #define SPDK_CONFIG_OPENSSL_PATH 00:24:49.914 #undef SPDK_CONFIG_PGO_CAPTURE 00:24:49.914 #define SPDK_CONFIG_PGO_DIR 00:24:49.914 #undef SPDK_CONFIG_PGO_USE 00:24:49.914 #define SPDK_CONFIG_PREFIX /usr/local 00:24:49.914 #undef SPDK_CONFIG_RAID5F 00:24:49.914 #undef SPDK_CONFIG_RBD 00:24:49.914 #define SPDK_CONFIG_RDMA 1 00:24:49.914 #define SPDK_CONFIG_RDMA_PROV verbs 00:24:49.914 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:24:49.914 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:24:49.914 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:24:49.914 #define SPDK_CONFIG_SHARED 1 00:24:49.914 #undef SPDK_CONFIG_SMA 00:24:49.914 #define SPDK_CONFIG_TESTS 1 00:24:49.914 #undef SPDK_CONFIG_TSAN 00:24:49.914 #define SPDK_CONFIG_UBLK 1 00:24:49.914 #define SPDK_CONFIG_UBSAN 1 00:24:49.914 #undef SPDK_CONFIG_UNIT_TESTS 00:24:49.914 #undef SPDK_CONFIG_URING 00:24:49.914 #define SPDK_CONFIG_URING_PATH 00:24:49.914 #undef SPDK_CONFIG_URING_ZNS 00:24:49.914 #undef SPDK_CONFIG_USDT 00:24:49.914 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:24:49.914 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:24:49.914 #undef SPDK_CONFIG_VFIO_USER 00:24:49.914 #define SPDK_CONFIG_VFIO_USER_DIR 00:24:49.914 #define SPDK_CONFIG_VHOST 1 00:24:49.914 #define SPDK_CONFIG_VIRTIO 1 00:24:49.914 #undef SPDK_CONFIG_VTUNE 00:24:49.914 #define SPDK_CONFIG_VTUNE_DIR 00:24:49.914 #define SPDK_CONFIG_WERROR 1 00:24:49.914 #define SPDK_CONFIG_WPDK_DIR 00:24:49.914 #undef SPDK_CONFIG_XNVME 00:24:49.914 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:24:49.914 04:25:37 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:49.914 04:25:37 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:49.914 04:25:37 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:49.914 04:25:37 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:49.914 04:25:37 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:49.914 04:25:37 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:49.914 04:25:37 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:49.914 04:25:37 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:24:49.914 04:25:37 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:24:49.914 04:25:37 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@57 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@61 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@63 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@65 -- # : 1 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@67 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@69 -- # : 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@71 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@73 -- # : 1 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@75 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@77 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@79 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@81 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@83 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@85 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@87 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@89 -- # : 0 
00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@91 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@93 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@95 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@97 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@99 -- # : 0 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@101 -- # : rdma 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:24:49.914 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@103 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@105 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@107 -- # : 1 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@109 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@111 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@113 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@115 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@117 -- # : 1 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@119 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@121 -- # : 1 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@123 -- # : 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:24:49.915 04:25:37 reap_unregistered_poller -- 
common/autotest_common.sh@125 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@127 -- # : 1 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@129 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@131 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@133 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@135 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@137 -- # : 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@139 -- # : true 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@141 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@143 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@145 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@147 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@149 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@151 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@153 -- # : 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@155 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@157 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@159 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:24:49.915 04:25:37 
reap_unregistered_poller -- common/autotest_common.sh@161 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@163 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@166 -- # : 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@168 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@170 -- # : 0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
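The long run of ': 0' / 'export SPDK_TEST_*' entries above is autotest_common.sh giving every test switch a default when autorun-spdk.conf left it unset, then exporting it so child test scripts see the same value. A hedged sketch of that shell idiom, using two flag names taken from the trace:

# Keep a value that autorun-spdk.conf already set; otherwise fall back to the default.
: "${SPDK_TEST_BLOCKDEV:=0}"
export SPDK_TEST_BLOCKDEV

# Non-boolean defaults use the same pattern (the trace shows ': rdma' for this one).
: "${SPDK_TEST_NVMF_TRANSPORT:=rdma}"
export SPDK_TEST_NVMF_TRANSPORT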
00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@199 -- # cat 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@235 -- # echo leak:libfuse3.so 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@237 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:24:49.915 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@237 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@239 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@239 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@241 -- # '[' -z /var/spdk/dependencies ']' 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@244 -- # export DEPENDENCY_DIR 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@248 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@248 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@252 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@252 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@253 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@255 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@255 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@258 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@258 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@261 -- # '[' 0 -eq 0 ']' 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@262 -- # export valgrind= 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@262 -- # valgrind= 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@268 -- # uname -s 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@268 -- # '[' Linux = Linux ']' 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@269 -- # HUGEMEM=4096 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@270 -- # export CLEAR_HUGE=yes 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@270 -- # CLEAR_HUGE=yes 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@271 -- # [[ 1 -eq 1 ]] 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@275 -- # export HUGE_EVEN_ALLOC=yes 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@275 -- # HUGE_EVEN_ALLOC=yes 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@278 -- # MAKE=make 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKEFLAGS=-j48 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@295 -- # export HUGEMEM=4096 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@295 -- # HUGEMEM=4096 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@297 -- # NO_HUGE=() 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@298 -- # TEST_MODE= 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@317 -- # [[ -z 3960590 ]] 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@317 -- # kill -0 3960590 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@1676 -- # set_test_storage 2147483648 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@327 -- # [[ -v 
testdir ]] 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@329 -- # local requested_size=2147483648 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local mount target_dir 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@332 -- # local -A mounts fss sizes avails uses 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local source fs size avail mount use 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@335 -- # local storage_fallback storage_candidates 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@337 -- # mktemp -udt spdk.XXXXXX 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@337 -- # storage_fallback=/tmp/spdk.omSXwp 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@342 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@344 -- # [[ -n '' ]] 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@349 -- # [[ -n '' ]] 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@354 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.omSXwp/tests/interrupt /tmp/spdk.omSXwp 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@357 -- # requested_size=2214592512 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@326 -- # df -T 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@326 -- # grep -v Filesystem 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_devtmpfs 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=devtmpfs 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=67108864 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=67108864 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=0 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=/dev/pmem0 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=ext2 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=969003008 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=5284429824 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=4315426816 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_root 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=overlay 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=55820967936 00:24:49.916 04:25:37 
reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=61994737664 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=6173769728 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=30992658432 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=30997368832 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=4710400 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=12389998592 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=12398948352 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=8949760 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=30996676608 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=30997368832 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=692224 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # avails["$mount"]=6199468032 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@361 -- # sizes["$mount"]=6199472128 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@362 -- # uses["$mount"]=4096 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@365 -- # printf '* Looking for test storage...\n' 00:24:49.916 * Looking for test storage... 
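set_test_storage, traced above, parses 'df -T' into per-mount arrays and then, in the lines that follow, picks a test directory whose filesystem has at least the requested free space. A simplified sketch of that bookkeeping; forcing --block-size=1 and the /tmp fallback are assumptions of the sketch, not taken from the trace:

requested_size=$((2 * 1024 * 1024 * 1024))   # 2 GiB, as requested above
declare -A fss sizes avails

# One pass over the mount table: remember type, total size and free bytes per mount point.
while read -r source fs size _ avail _ mount; do
    fss["$mount"]=$fs
    sizes["$mount"]=$size
    avails["$mount"]=$avail
done < <(df -T --block-size=1 | grep -v Filesystem)

# Try candidate directories in order and keep the first one with enough room.
for target_dir in "$PWD" /tmp; do
    mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
    if [ "${avails[$mount]:-0}" -ge "$requested_size" ]; then
        echo "using $target_dir on $mount (${avails[$mount]} bytes available)"
        break
    fi
done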
00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@367 -- # local target_space new_size 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@368 -- # for target_dir in "${storage_candidates[@]}" 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@371 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@371 -- # awk '$1 !~ /Filesystem/{print $6}' 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@371 -- # mount=/ 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@373 -- # target_space=55820967936 00:24:49.916 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@374 -- # (( target_space == 0 || target_space < requested_size )) 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@377 -- # (( target_space >= requested_size )) 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@379 -- # [[ overlay == tmpfs ]] 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@379 -- # [[ overlay == ramfs ]] 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@379 -- # [[ / == / ]] 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@380 -- # new_size=8388362240 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@381 -- # (( new_size * 100 / sizes[/] > 95 )) 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@386 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@386 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@387 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:49.917 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@388 -- # return 0 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@1678 -- # set -o errtrace 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@1679 -- # shopt -s extdebug 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # true 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@1685 -- # xtrace_fd 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=3960631 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:49.917 04:25:37 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 3960631 /var/tmp/spdk.sock 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@827 -- # '[' -z 3960631 ']' 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
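start_intr_tgt launches the interrupt_tgt example with the core mask and RPC socket shown above, and waitforlisten (from autotest_common.sh) then blocks until the target answers on /var/tmp/spdk.sock. The real helper retries RPCs under a max_retries budget; the stand-in below only polls for the UNIX socket, which is a simplification for illustration:

rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path taken from the trace
rpc_sock=/var/tmp/spdk.sock

# Start the target in the background with the same flags as in the trace (0x07 = cores 0-2).
"$rootdir/build/examples/interrupt_tgt" -m 0x07 -r "$rpc_sock" -E -g &
intr_tgt_pid=$!

# Wait up to ~10 seconds for the RPC socket to appear, bailing out if the target dies first.
for _ in $(seq 1 100); do
    [ -S "$rpc_sock" ] && break
    kill -0 "$intr_tgt_pid" 2>/dev/null || { echo "interrupt_tgt exited early" >&2; exit 1; }
    sleep 0.1
done
[ -S "$rpc_sock" ] || { echo "timed out waiting for $rpc_sock" >&2; exit 1; }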
00:24:49.917 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:49.917 04:25:37 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:24:49.917 [2024-05-15 04:25:37.830858] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:49.917 [2024-05-15 04:25:37.830946] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3960631 ] 00:24:49.917 [2024-05-15 04:25:37.910041] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:24:50.176 [2024-05-15 04:25:38.028250] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:24:50.176 [2024-05-15 04:25:38.030844] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:24:50.176 [2024-05-15 04:25:38.030856] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:24:50.176 [2024-05-15 04:25:38.115160] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:24:50.176 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:50.176 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@860 -- # return 0 00:24:50.176 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:24:50.176 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:24:50.176 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:24:50.176 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:24:50.176 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:24:50.176 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:24:50.176 "name": "app_thread", 00:24:50.176 "id": 1, 00:24:50.176 "active_pollers": [], 00:24:50.176 "timed_pollers": [ 00:24:50.176 { 00:24:50.176 "name": "rpc_subsystem_poll_servers", 00:24:50.176 "id": 1, 00:24:50.176 "state": "waiting", 00:24:50.176 "run_count": 0, 00:24:50.176 "busy_count": 0, 00:24:50.176 "period_ticks": 10800000 00:24:50.176 } 00:24:50.176 ], 00:24:50.176 "paused_pollers": [] 00:24:50.176 }' 00:24:50.176 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:24:50.472 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:24:50.472 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:24:50.472 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:24:50.472 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:24:50.472 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:24:50.472 04:25:38 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:24:50.472 04:25:38 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:24:50.472 04:25:38 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:24:50.472 5000+0 records in 00:24:50.472 5000+0 records out 00:24:50.472 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0117679 s, 870 MB/s 00:24:50.472 04:25:38 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:24:50.731 AIO0 00:24:50.731 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:24:50.989 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:24:50.989 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:24:50.989 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:24:50.989 "name": "app_thread", 00:24:50.989 "id": 1, 00:24:50.989 "active_pollers": [], 00:24:50.989 "timed_pollers": [ 00:24:50.989 { 00:24:50.989 "name": "rpc_subsystem_poll_servers", 00:24:50.989 "id": 1, 00:24:50.989 "state": "waiting", 00:24:50.989 "run_count": 0, 00:24:50.989 "busy_count": 0, 00:24:50.989 "period_ticks": 10800000 00:24:50.989 } 00:24:50.989 ], 00:24:50.989 "paused_pollers": [] 00:24:50.989 }' 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:24:50.989 04:25:38 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 3960631 00:24:50.989 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@946 -- # '[' -z 3960631 ']' 00:24:50.989 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@950 -- # kill -0 3960631 00:24:50.989 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@951 -- # uname 00:24:50.989 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:24:50.989 04:25:38 reap_unregistered_poller -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3960631 00:24:51.248 04:25:39 reap_unregistered_poller -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:24:51.248 04:25:39 
reap_unregistered_poller -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:24:51.248 04:25:39 reap_unregistered_poller -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3960631' 00:24:51.248 killing process with pid 3960631 00:24:51.248 04:25:39 reap_unregistered_poller -- common/autotest_common.sh@965 -- # kill 3960631 00:24:51.248 04:25:39 reap_unregistered_poller -- common/autotest_common.sh@970 -- # wait 3960631 00:24:51.506 04:25:39 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:24:51.506 04:25:39 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:24:51.506 00:24:51.506 real 0m1.625s 00:24:51.506 user 0m1.327s 00:24:51.506 sys 0m0.428s 00:24:51.506 04:25:39 reap_unregistered_poller -- common/autotest_common.sh@1122 -- # xtrace_disable 00:24:51.506 04:25:39 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:24:51.506 ************************************ 00:24:51.506 END TEST reap_unregistered_poller 00:24:51.506 ************************************ 00:24:51.506 04:25:39 -- spdk/autotest.sh@194 -- # uname -s 00:24:51.506 04:25:39 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:24:51.506 04:25:39 -- spdk/autotest.sh@195 -- # [[ 1 -eq 1 ]] 00:24:51.506 04:25:39 -- spdk/autotest.sh@201 -- # [[ 1 -eq 0 ]] 00:24:51.506 04:25:39 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:24:51.506 04:25:39 -- spdk/autotest.sh@252 -- # '[' 0 -eq 1 ']' 00:24:51.506 04:25:39 -- spdk/autotest.sh@256 -- # timing_exit lib 00:24:51.506 04:25:39 -- common/autotest_common.sh@726 -- # xtrace_disable 00:24:51.506 04:25:39 -- common/autotest_common.sh@10 -- # set +x 00:24:51.506 04:25:39 -- spdk/autotest.sh@258 -- # '[' 0 -eq 1 ']' 00:24:51.506 04:25:39 -- spdk/autotest.sh@266 -- # '[' 0 -eq 1 ']' 00:24:51.506 04:25:39 -- spdk/autotest.sh@275 -- # '[' 0 -eq 1 ']' 00:24:51.506 04:25:39 -- spdk/autotest.sh@304 -- # '[' 0 -eq 1 ']' 00:24:51.506 04:25:39 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:24:51.507 04:25:39 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:24:51.507 04:25:39 -- spdk/autotest.sh@317 -- # '[' 0 -eq 1 ']' 00:24:51.507 04:25:39 -- spdk/autotest.sh@326 -- # '[' 0 -eq 1 ']' 00:24:51.507 04:25:39 -- spdk/autotest.sh@331 -- # '[' 0 -eq 1 ']' 00:24:51.507 04:25:39 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:24:51.507 04:25:39 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:24:51.507 04:25:39 -- spdk/autotest.sh@343 -- # '[' 1 -eq 1 ']' 00:24:51.507 04:25:39 -- spdk/autotest.sh@344 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:24:51.507 04:25:39 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:24:51.507 04:25:39 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:24:51.507 04:25:39 -- common/autotest_common.sh@10 -- # set +x 00:24:51.507 ************************************ 00:24:51.507 START TEST compress_compdev 00:24:51.507 ************************************ 00:24:51.507 04:25:39 compress_compdev -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:24:51.507 * Looking for test storage... 
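The reap_unregistered_poller run above drives everything through the SPDK RPC layer: thread_get_pollers is queried before and after an AIO bdev is created on a 10 MB file, and jq filters pull the poller names out of the returned JSON so the test can check which pollers remain. A minimal sketch of that inspection pattern, assuming a target is already listening on the default /var/tmp/spdk.sock (the test itself goes through its rpc_cmd helper rather than plain rpc.py; paths below reuse the ones shown in the log):

#!/usr/bin/env bash
# Sketch only: mirrors the poller inspection recorded above against a running SPDK app.
set -euo pipefail

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk   # tree path taken from the log
RPC="$SPDK/scripts/rpc.py"                             # talks to /var/tmp/spdk.sock by default

# Snapshot the first thread's pollers with the same jq filters the test uses.
app_thread=$("$RPC" thread_get_pollers | jq -r '.threads[0]')
native_pollers=$(jq -r '.active_pollers[].name' <<< "$app_thread")
native_pollers+=" $(jq -r '.timed_pollers[].name' <<< "$app_thread")"

# setup_bdev_aio: back an AIO bdev with a 10 MB zero-filled file.
dd if=/dev/zero of="$SPDK/test/interrupt/aiofile" bs=2048 count=5000
"$RPC" bdev_aio_create "$SPDK/test/interrupt/aiofile" AIO0 2048
"$RPC" bdev_wait_for_examine

# Re-query and compare; in the log rpc_subsystem_poll_servers is the only timed poller both times.
remaining_pollers=$("$RPC" thread_get_pollers | jq -r '.threads[0].timed_pollers[].name')
echo "pollers before:$native_pollers  after: $remaining_pollers"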
00:24:51.507 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:24:51.507 04:25:39 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8b464f06-2980-e311-ba20-001e67a94acd 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=8b464f06-2980-e311-ba20-001e67a94acd 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:24:51.507 04:25:39 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:24:51.507 04:25:39 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:24:51.507 04:25:39 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:24:51.507 04:25:39 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:51.507 04:25:39 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:51.507 04:25:39 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:51.507 04:25:39 compress_compdev -- paths/export.sh@5 -- # export PATH 00:24:51.507 04:25:39 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:24:51.507 04:25:39 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:24:51.507 04:25:39 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:51.507 04:25:39 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:24:51.507 04:25:39 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:24:51.507 04:25:39 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:24:51.507 04:25:39 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:24:51.507 04:25:39 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=3960979 00:24:51.507 04:25:39 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:24:51.507 04:25:39 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:24:51.507 04:25:39 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 3960979 00:24:51.507 04:25:39 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 3960979 ']' 00:24:51.507 04:25:39 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:51.507 04:25:39 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:24:51.507 04:25:39 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:51.507 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
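The run_bdevperf step above launches a fresh bdevperf instance with the command line recorded in the log (-z defers the workload until perform_tests is issued over RPC) and then blocks until the application's RPC socket answers; the "Waiting for process to start up..." echo is printed by that wait. A rough sketch of the launch-and-wait step, reusing the paths and flags shown above; the polling probe (rpc_get_methods with a 1 s timeout) is an assumption, the harness's waitforlisten helper may do it differently:

#!/usr/bin/env bash
# Sketch only: start bdevperf as recorded above and wait for its RPC socket.
set -euo pipefail

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk   # tree path taken from the log
RPC="$SPDK/scripts/rpc.py"
SOCK=/var/tmp/spdk.sock

# Command line copied from the record above; -m 0x6 matches the "Reactor started
# on core 1/2" notices that follow, -c points at the compressdev DPDK config.
"$SPDK/build/examples/bdevperf" -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 \
    -c "$SPDK/test/compress/dpdk.json" &
bdevperf_pid=$!

echo "Waiting for process to start up and listen on UNIX domain socket $SOCK..."
# Assumed probe: poll a cheap RPC until the socket accepts requests.
until "$RPC" -s "$SOCK" -t 1 rpc_get_methods &> /dev/null; do
    kill -0 "$bdevperf_pid"    # abort if bdevperf died during startup
    sleep 0.5
done
echo "bdevperf (pid $bdevperf_pid) is ready"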
00:24:51.507 04:25:39 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:24:51.507 04:25:39 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:24:51.507 [2024-05-15 04:25:39.485183] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:24:51.507 [2024-05-15 04:25:39.485254] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3960979 ] 00:24:51.765 [2024-05-15 04:25:39.561341] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:24:51.765 [2024-05-15 04:25:39.671240] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:24:51.765 [2024-05-15 04:25:39.671245] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:24:52.331 [2024-05-15 04:25:40.291246] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:24:52.588 04:25:40 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:24:52.588 04:25:40 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:24:52.588 04:25:40 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:24:52.588 04:25:40 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:24:52.588 04:25:40 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:24:55.867 [2024-05-15 04:25:43.539508] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1156840 PMD being used: compress_qat 00:24:55.867 04:25:43 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:24:55.867 04:25:43 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:24:55.867 04:25:43 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:24:55.867 04:25:43 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:24:55.867 04:25:43 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:24:55.867 04:25:43 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:24:55.867 04:25:43 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:55.867 04:25:43 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:24:56.125 [ 00:24:56.125 { 00:24:56.125 "name": "Nvme0n1", 00:24:56.125 "aliases": [ 00:24:56.125 "337e41c0-ba10-4264-b890-4f8d54ea1c6c" 00:24:56.125 ], 00:24:56.125 "product_name": "NVMe disk", 00:24:56.125 "block_size": 512, 00:24:56.125 "num_blocks": 3907029168, 00:24:56.125 "uuid": "337e41c0-ba10-4264-b890-4f8d54ea1c6c", 00:24:56.125 "assigned_rate_limits": { 00:24:56.125 "rw_ios_per_sec": 0, 00:24:56.125 "rw_mbytes_per_sec": 0, 00:24:56.125 "r_mbytes_per_sec": 0, 00:24:56.125 "w_mbytes_per_sec": 0 00:24:56.125 }, 00:24:56.125 "claimed": false, 00:24:56.125 "zoned": false, 00:24:56.125 "supported_io_types": { 00:24:56.125 "read": true, 00:24:56.125 "write": true, 00:24:56.125 "unmap": true, 00:24:56.125 "write_zeroes": true, 00:24:56.125 "flush": true, 00:24:56.125 "reset": true, 00:24:56.125 "compare": false, 00:24:56.125 "compare_and_write": false, 00:24:56.125 "abort": true, 00:24:56.125 "nvme_admin": true, 00:24:56.125 
"nvme_io": true 00:24:56.125 }, 00:24:56.125 "driver_specific": { 00:24:56.125 "nvme": [ 00:24:56.125 { 00:24:56.125 "pci_address": "0000:81:00.0", 00:24:56.125 "trid": { 00:24:56.125 "trtype": "PCIe", 00:24:56.125 "traddr": "0000:81:00.0" 00:24:56.125 }, 00:24:56.125 "ctrlr_data": { 00:24:56.125 "cntlid": 0, 00:24:56.125 "vendor_id": "0x8086", 00:24:56.125 "model_number": "INTEL SSDPE2KX020T8", 00:24:56.125 "serial_number": "PHLJ951302VM2P0BGN", 00:24:56.125 "firmware_revision": "VDV10184", 00:24:56.125 "oacs": { 00:24:56.125 "security": 0, 00:24:56.125 "format": 1, 00:24:56.125 "firmware": 1, 00:24:56.125 "ns_manage": 1 00:24:56.125 }, 00:24:56.125 "multi_ctrlr": false, 00:24:56.125 "ana_reporting": false 00:24:56.125 }, 00:24:56.125 "vs": { 00:24:56.125 "nvme_version": "1.2" 00:24:56.125 }, 00:24:56.125 "ns_data": { 00:24:56.125 "id": 1, 00:24:56.125 "can_share": false 00:24:56.125 } 00:24:56.125 } 00:24:56.125 ], 00:24:56.125 "mp_policy": "active_passive" 00:24:56.125 } 00:24:56.125 } 00:24:56.125 ] 00:24:56.125 04:25:44 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:24:56.125 04:25:44 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:24:56.383 [2024-05-15 04:25:44.297034] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfa49a0 PMD being used: compress_qat 00:24:57.317 2dfb41d9-2c84-4b13-ad72-4a895ec00a10 00:24:57.317 04:25:45 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:24:57.883 9528748f-9f20-4539-83c9-349041f13881 00:24:57.883 04:25:45 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:24:57.883 04:25:45 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:24:57.883 04:25:45 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:24:57.883 04:25:45 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:24:57.883 04:25:45 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:24:57.883 04:25:45 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:24:57.883 04:25:45 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:57.883 04:25:45 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:24:58.141 [ 00:24:58.141 { 00:24:58.141 "name": "9528748f-9f20-4539-83c9-349041f13881", 00:24:58.141 "aliases": [ 00:24:58.141 "lvs0/lv0" 00:24:58.141 ], 00:24:58.141 "product_name": "Logical Volume", 00:24:58.141 "block_size": 512, 00:24:58.141 "num_blocks": 204800, 00:24:58.141 "uuid": "9528748f-9f20-4539-83c9-349041f13881", 00:24:58.141 "assigned_rate_limits": { 00:24:58.141 "rw_ios_per_sec": 0, 00:24:58.141 "rw_mbytes_per_sec": 0, 00:24:58.141 "r_mbytes_per_sec": 0, 00:24:58.141 "w_mbytes_per_sec": 0 00:24:58.141 }, 00:24:58.141 "claimed": false, 00:24:58.141 "zoned": false, 00:24:58.141 "supported_io_types": { 00:24:58.141 "read": true, 00:24:58.141 "write": true, 00:24:58.141 "unmap": true, 00:24:58.141 "write_zeroes": true, 00:24:58.141 "flush": false, 00:24:58.141 "reset": true, 00:24:58.141 "compare": false, 00:24:58.141 "compare_and_write": false, 00:24:58.141 "abort": false, 00:24:58.141 "nvme_admin": false, 00:24:58.141 
"nvme_io": false 00:24:58.141 }, 00:24:58.141 "driver_specific": { 00:24:58.141 "lvol": { 00:24:58.141 "lvol_store_uuid": "2dfb41d9-2c84-4b13-ad72-4a895ec00a10", 00:24:58.141 "base_bdev": "Nvme0n1", 00:24:58.141 "thin_provision": true, 00:24:58.141 "num_allocated_clusters": 0, 00:24:58.141 "snapshot": false, 00:24:58.141 "clone": false, 00:24:58.141 "esnap_clone": false 00:24:58.141 } 00:24:58.141 } 00:24:58.141 } 00:24:58.141 ] 00:24:58.141 04:25:46 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:24:58.141 04:25:46 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:24:58.141 04:25:46 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:24:58.399 [2024-05-15 04:25:46.334910] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:24:58.399 COMP_lvs0/lv0 00:24:58.399 04:25:46 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:24:58.399 04:25:46 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:24:58.399 04:25:46 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:24:58.399 04:25:46 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:24:58.399 04:25:46 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:24:58.399 04:25:46 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:24:58.399 04:25:46 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:24:58.657 04:25:46 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:24:58.915 [ 00:24:58.915 { 00:24:58.915 "name": "COMP_lvs0/lv0", 00:24:58.915 "aliases": [ 00:24:58.915 "e2bb493b-6ade-5791-9ed9-7195c4f71a7b" 00:24:58.915 ], 00:24:58.915 "product_name": "compress", 00:24:58.915 "block_size": 512, 00:24:58.915 "num_blocks": 200704, 00:24:58.915 "uuid": "e2bb493b-6ade-5791-9ed9-7195c4f71a7b", 00:24:58.915 "assigned_rate_limits": { 00:24:58.915 "rw_ios_per_sec": 0, 00:24:58.915 "rw_mbytes_per_sec": 0, 00:24:58.915 "r_mbytes_per_sec": 0, 00:24:58.915 "w_mbytes_per_sec": 0 00:24:58.915 }, 00:24:58.915 "claimed": false, 00:24:58.915 "zoned": false, 00:24:58.915 "supported_io_types": { 00:24:58.915 "read": true, 00:24:58.915 "write": true, 00:24:58.915 "unmap": false, 00:24:58.915 "write_zeroes": true, 00:24:58.915 "flush": false, 00:24:58.915 "reset": false, 00:24:58.915 "compare": false, 00:24:58.915 "compare_and_write": false, 00:24:58.915 "abort": false, 00:24:58.915 "nvme_admin": false, 00:24:58.915 "nvme_io": false 00:24:58.915 }, 00:24:58.915 "driver_specific": { 00:24:58.915 "compress": { 00:24:58.915 "name": "COMP_lvs0/lv0", 00:24:58.915 "base_bdev_name": "9528748f-9f20-4539-83c9-349041f13881" 00:24:58.915 } 00:24:58.915 } 00:24:58.915 } 00:24:58.915 ] 00:24:58.915 04:25:46 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:24:58.915 04:25:46 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:24:58.915 [2024-05-15 04:25:46.921550] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fe20c1b15a0 PMD being used: compress_qat 00:24:58.915 [2024-05-15 04:25:46.923431] accel_dpdk_compressdev.c: 
690:_set_pmd: *NOTICE*: Channel 0x13491c0 PMD being used: compress_qat 00:24:58.915 Running I/O for 3 seconds... 00:25:02.195 00:25:02.195 Latency(us) 00:25:02.195 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:02.195 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:25:02.195 Verification LBA range: start 0x0 length 0x3100 00:25:02.195 COMP_lvs0/lv0 : 3.01 3397.56 13.27 0.00 0.00 9377.24 144.12 17282.09 00:25:02.195 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:25:02.195 Verification LBA range: start 0x3100 length 0x3100 00:25:02.195 COMP_lvs0/lv0 : 3.01 3460.58 13.52 0.00 0.00 9205.74 133.50 16505.36 00:25:02.195 =================================================================================================================== 00:25:02.195 Total : 6858.14 26.79 0.00 0.00 9290.69 133.50 17282.09 00:25:02.195 0 00:25:02.195 04:25:49 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:25:02.195 04:25:49 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:02.195 04:25:50 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:02.452 04:25:50 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:25:02.452 04:25:50 compress_compdev -- compress/compress.sh@78 -- # killprocess 3960979 00:25:02.452 04:25:50 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 3960979 ']' 00:25:02.453 04:25:50 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 3960979 00:25:02.453 04:25:50 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:25:02.453 04:25:50 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:25:02.453 04:25:50 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3960979 00:25:02.710 04:25:50 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:25:02.710 04:25:50 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:25:02.710 04:25:50 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3960979' 00:25:02.710 killing process with pid 3960979 00:25:02.710 04:25:50 compress_compdev -- common/autotest_common.sh@965 -- # kill 3960979 00:25:02.710 Received shutdown signal, test time was about 3.000000 seconds 00:25:02.710 00:25:02.710 Latency(us) 00:25:02.710 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:02.710 =================================================================================================================== 00:25:02.710 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:02.710 04:25:50 compress_compdev -- common/autotest_common.sh@970 -- # wait 3960979 00:25:05.239 04:25:53 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:25:05.239 04:25:53 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:25:05.239 04:25:53 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=3962588 00:25:05.239 04:25:53 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:25:05.239 04:25:53 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:05.239 04:25:53 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 3962588 00:25:05.239 04:25:53 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 3962588 ']' 00:25:05.239 04:25:53 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:05.239 04:25:53 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:25:05.239 04:25:53 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:05.239 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:05.239 04:25:53 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:25:05.239 04:25:53 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:25:05.239 [2024-05-15 04:25:53.076496] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:25:05.239 [2024-05-15 04:25:53.076583] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3962588 ] 00:25:05.239 [2024-05-15 04:25:53.165099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:05.505 [2024-05-15 04:25:53.290148] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:25:05.505 [2024-05-15 04:25:53.290153] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:25:06.069 [2024-05-15 04:25:53.884800] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:06.069 04:25:54 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:06.069 04:25:54 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:25:06.069 04:25:54 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:25:06.069 04:25:54 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:06.069 04:25:54 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:09.344 [2024-05-15 04:25:57.137532] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20f3840 PMD being used: compress_qat 00:25:09.344 04:25:57 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:09.344 04:25:57 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:25:09.344 04:25:57 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:09.344 04:25:57 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:25:09.344 04:25:57 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:09.344 04:25:57 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:09.344 04:25:57 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:09.601 04:25:57 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:25:09.858 [ 00:25:09.858 { 00:25:09.858 "name": "Nvme0n1", 00:25:09.858 "aliases": [ 00:25:09.858 "9bbef83d-14ca-451b-a12f-4dfea5571df0" 00:25:09.858 ], 00:25:09.858 
"product_name": "NVMe disk", 00:25:09.858 "block_size": 512, 00:25:09.858 "num_blocks": 3907029168, 00:25:09.858 "uuid": "9bbef83d-14ca-451b-a12f-4dfea5571df0", 00:25:09.858 "assigned_rate_limits": { 00:25:09.858 "rw_ios_per_sec": 0, 00:25:09.858 "rw_mbytes_per_sec": 0, 00:25:09.858 "r_mbytes_per_sec": 0, 00:25:09.858 "w_mbytes_per_sec": 0 00:25:09.858 }, 00:25:09.858 "claimed": false, 00:25:09.858 "zoned": false, 00:25:09.858 "supported_io_types": { 00:25:09.858 "read": true, 00:25:09.858 "write": true, 00:25:09.858 "unmap": true, 00:25:09.858 "write_zeroes": true, 00:25:09.858 "flush": true, 00:25:09.858 "reset": true, 00:25:09.858 "compare": false, 00:25:09.858 "compare_and_write": false, 00:25:09.858 "abort": true, 00:25:09.858 "nvme_admin": true, 00:25:09.858 "nvme_io": true 00:25:09.858 }, 00:25:09.858 "driver_specific": { 00:25:09.858 "nvme": [ 00:25:09.858 { 00:25:09.858 "pci_address": "0000:81:00.0", 00:25:09.858 "trid": { 00:25:09.858 "trtype": "PCIe", 00:25:09.858 "traddr": "0000:81:00.0" 00:25:09.858 }, 00:25:09.858 "ctrlr_data": { 00:25:09.858 "cntlid": 0, 00:25:09.858 "vendor_id": "0x8086", 00:25:09.858 "model_number": "INTEL SSDPE2KX020T8", 00:25:09.858 "serial_number": "PHLJ951302VM2P0BGN", 00:25:09.858 "firmware_revision": "VDV10184", 00:25:09.858 "oacs": { 00:25:09.858 "security": 0, 00:25:09.858 "format": 1, 00:25:09.858 "firmware": 1, 00:25:09.858 "ns_manage": 1 00:25:09.858 }, 00:25:09.858 "multi_ctrlr": false, 00:25:09.858 "ana_reporting": false 00:25:09.858 }, 00:25:09.858 "vs": { 00:25:09.858 "nvme_version": "1.2" 00:25:09.858 }, 00:25:09.858 "ns_data": { 00:25:09.858 "id": 1, 00:25:09.858 "can_share": false 00:25:09.858 } 00:25:09.858 } 00:25:09.858 ], 00:25:09.858 "mp_policy": "active_passive" 00:25:09.858 } 00:25:09.858 } 00:25:09.858 ] 00:25:09.858 04:25:57 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:25:09.858 04:25:57 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:10.115 [2024-05-15 04:25:57.935360] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f41c60 PMD being used: compress_qat 00:25:11.049 43da219c-da91-439c-ad89-fc0186e2dbbc 00:25:11.049 04:25:58 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:11.306 9fc3422d-f0a9-4657-b4fb-7ff9011faebb 00:25:11.306 04:25:59 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:11.306 04:25:59 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:25:11.306 04:25:59 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:11.306 04:25:59 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:25:11.306 04:25:59 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:11.306 04:25:59 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:11.306 04:25:59 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:11.564 04:25:59 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:11.822 [ 00:25:11.822 { 00:25:11.822 "name": "9fc3422d-f0a9-4657-b4fb-7ff9011faebb", 00:25:11.822 "aliases": [ 00:25:11.822 "lvs0/lv0" 00:25:11.822 ], 00:25:11.822 
"product_name": "Logical Volume", 00:25:11.822 "block_size": 512, 00:25:11.822 "num_blocks": 204800, 00:25:11.822 "uuid": "9fc3422d-f0a9-4657-b4fb-7ff9011faebb", 00:25:11.822 "assigned_rate_limits": { 00:25:11.822 "rw_ios_per_sec": 0, 00:25:11.822 "rw_mbytes_per_sec": 0, 00:25:11.822 "r_mbytes_per_sec": 0, 00:25:11.822 "w_mbytes_per_sec": 0 00:25:11.822 }, 00:25:11.822 "claimed": false, 00:25:11.822 "zoned": false, 00:25:11.822 "supported_io_types": { 00:25:11.822 "read": true, 00:25:11.822 "write": true, 00:25:11.822 "unmap": true, 00:25:11.822 "write_zeroes": true, 00:25:11.822 "flush": false, 00:25:11.822 "reset": true, 00:25:11.822 "compare": false, 00:25:11.822 "compare_and_write": false, 00:25:11.822 "abort": false, 00:25:11.822 "nvme_admin": false, 00:25:11.822 "nvme_io": false 00:25:11.822 }, 00:25:11.822 "driver_specific": { 00:25:11.822 "lvol": { 00:25:11.822 "lvol_store_uuid": "43da219c-da91-439c-ad89-fc0186e2dbbc", 00:25:11.822 "base_bdev": "Nvme0n1", 00:25:11.822 "thin_provision": true, 00:25:11.822 "num_allocated_clusters": 0, 00:25:11.822 "snapshot": false, 00:25:11.822 "clone": false, 00:25:11.822 "esnap_clone": false 00:25:11.822 } 00:25:11.822 } 00:25:11.822 } 00:25:11.822 ] 00:25:11.822 04:25:59 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:25:11.822 04:25:59 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:25:11.822 04:25:59 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:25:12.080 [2024-05-15 04:25:59.907208] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:12.080 COMP_lvs0/lv0 00:25:12.080 04:25:59 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:12.080 04:25:59 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:25:12.080 04:25:59 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:12.080 04:25:59 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:25:12.080 04:25:59 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:12.080 04:25:59 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:12.080 04:25:59 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:12.337 04:26:00 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:12.595 [ 00:25:12.595 { 00:25:12.595 "name": "COMP_lvs0/lv0", 00:25:12.595 "aliases": [ 00:25:12.595 "72c0645d-cbee-511a-947f-0a5e17dc10ce" 00:25:12.595 ], 00:25:12.595 "product_name": "compress", 00:25:12.595 "block_size": 512, 00:25:12.595 "num_blocks": 200704, 00:25:12.595 "uuid": "72c0645d-cbee-511a-947f-0a5e17dc10ce", 00:25:12.595 "assigned_rate_limits": { 00:25:12.595 "rw_ios_per_sec": 0, 00:25:12.595 "rw_mbytes_per_sec": 0, 00:25:12.595 "r_mbytes_per_sec": 0, 00:25:12.595 "w_mbytes_per_sec": 0 00:25:12.595 }, 00:25:12.595 "claimed": false, 00:25:12.595 "zoned": false, 00:25:12.595 "supported_io_types": { 00:25:12.595 "read": true, 00:25:12.595 "write": true, 00:25:12.595 "unmap": false, 00:25:12.595 "write_zeroes": true, 00:25:12.595 "flush": false, 00:25:12.595 "reset": false, 00:25:12.595 "compare": false, 00:25:12.595 "compare_and_write": false, 00:25:12.595 
"abort": false, 00:25:12.595 "nvme_admin": false, 00:25:12.595 "nvme_io": false 00:25:12.595 }, 00:25:12.595 "driver_specific": { 00:25:12.595 "compress": { 00:25:12.595 "name": "COMP_lvs0/lv0", 00:25:12.595 "base_bdev_name": "9fc3422d-f0a9-4657-b4fb-7ff9011faebb" 00:25:12.595 } 00:25:12.595 } 00:25:12.595 } 00:25:12.595 ] 00:25:12.595 04:26:00 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:25:12.595 04:26:00 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:25:12.595 [2024-05-15 04:26:00.529741] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8ecc1b15a0 PMD being used: compress_qat 00:25:12.595 [2024-05-15 04:26:00.531707] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20e7a90 PMD being used: compress_qat 00:25:12.595 Running I/O for 3 seconds... 00:25:15.874 00:25:15.874 Latency(us) 00:25:15.874 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:15.874 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:25:15.874 Verification LBA range: start 0x0 length 0x3100 00:25:15.874 COMP_lvs0/lv0 : 3.01 3498.63 13.67 0.00 0.00 9097.20 141.84 14078.10 00:25:15.874 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:25:15.874 Verification LBA range: start 0x3100 length 0x3100 00:25:15.874 COMP_lvs0/lv0 : 3.01 3605.94 14.09 0.00 0.00 8828.43 136.53 13301.38 00:25:15.874 =================================================================================================================== 00:25:15.874 Total : 7104.57 27.75 0.00 0.00 8960.85 136.53 14078.10 00:25:15.874 0 00:25:15.874 04:26:03 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:25:15.874 04:26:03 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:15.874 04:26:03 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:16.132 04:26:04 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:25:16.132 04:26:04 compress_compdev -- compress/compress.sh@78 -- # killprocess 3962588 00:25:16.132 04:26:04 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 3962588 ']' 00:25:16.132 04:26:04 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 3962588 00:25:16.132 04:26:04 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:25:16.132 04:26:04 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:25:16.132 04:26:04 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3962588 00:25:16.132 04:26:04 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:25:16.132 04:26:04 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:25:16.132 04:26:04 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3962588' 00:25:16.132 killing process with pid 3962588 00:25:16.132 04:26:04 compress_compdev -- common/autotest_common.sh@965 -- # kill 3962588 00:25:16.132 Received shutdown signal, test time was about 3.000000 seconds 00:25:16.132 00:25:16.132 Latency(us) 00:25:16.132 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:16.132 
=================================================================================================================== 00:25:16.132 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:16.132 04:26:04 compress_compdev -- common/autotest_common.sh@970 -- # wait 3962588 00:25:18.659 04:26:06 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:25:18.659 04:26:06 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:25:18.659 04:26:06 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=3964196 00:25:18.659 04:26:06 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:25:18.659 04:26:06 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:18.659 04:26:06 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 3964196 00:25:18.659 04:26:06 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 3964196 ']' 00:25:18.659 04:26:06 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:18.659 04:26:06 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:25:18.659 04:26:06 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:18.659 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:18.659 04:26:06 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:25:18.659 04:26:06 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:25:18.917 [2024-05-15 04:26:06.712193] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
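Each pass ends the same way before the next bdevperf instance is launched, as the records above show for pids 3960979 and 3962588: destroy_vols removes the compress vbdev first and the lvstore underneath it second, then killprocess stops bdevperf and the script waits for it to exit so the RPC socket can be reused. A minimal sketch of that teardown step, using the same bdev names as the log; how the pid is tracked is left to the caller:

#!/usr/bin/env bash
# Sketch only: tear down in the order the log shows (compress vbdev -> lvstore -> process).
set -euo pipefail

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk   # tree path taken from the log
RPC="$SPDK/scripts/rpc.py"
bdevperf_pid=$1                                        # pid printed when bdevperf was launched

# destroy_vols: the compress vbdev has to go before the lvstore it sits on.
"$RPC" bdev_compress_delete COMP_lvs0/lv0
"$RPC" bdev_lvol_delete_lvstore -l lvs0

# killprocess: stop bdevperf and wait until the pid is really gone.
kill "$bdevperf_pid"
while kill -0 "$bdevperf_pid" 2> /dev/null; do
    sleep 0.2
done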
00:25:18.917 [2024-05-15 04:26:06.712264] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3964196 ] 00:25:18.917 [2024-05-15 04:26:06.789757] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:18.917 [2024-05-15 04:26:06.897223] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:25:18.917 [2024-05-15 04:26:06.897227] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:25:19.481 [2024-05-15 04:26:07.476285] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:19.739 04:26:07 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:19.739 04:26:07 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:25:19.739 04:26:07 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:25:19.739 04:26:07 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:19.739 04:26:07 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:23.033 [2024-05-15 04:26:10.769869] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x28e4840 PMD being used: compress_qat 00:25:23.033 04:26:10 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:23.033 04:26:10 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:25:23.033 04:26:10 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:23.033 04:26:10 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:25:23.033 04:26:10 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:23.033 04:26:10 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:23.033 04:26:10 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:23.290 04:26:11 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:25:23.291 [ 00:25:23.291 { 00:25:23.291 "name": "Nvme0n1", 00:25:23.291 "aliases": [ 00:25:23.291 "96c46716-7084-4296-94c7-767a3b6fc948" 00:25:23.291 ], 00:25:23.291 "product_name": "NVMe disk", 00:25:23.291 "block_size": 512, 00:25:23.291 "num_blocks": 3907029168, 00:25:23.291 "uuid": "96c46716-7084-4296-94c7-767a3b6fc948", 00:25:23.291 "assigned_rate_limits": { 00:25:23.291 "rw_ios_per_sec": 0, 00:25:23.291 "rw_mbytes_per_sec": 0, 00:25:23.291 "r_mbytes_per_sec": 0, 00:25:23.291 "w_mbytes_per_sec": 0 00:25:23.291 }, 00:25:23.291 "claimed": false, 00:25:23.291 "zoned": false, 00:25:23.291 "supported_io_types": { 00:25:23.291 "read": true, 00:25:23.291 "write": true, 00:25:23.291 "unmap": true, 00:25:23.291 "write_zeroes": true, 00:25:23.291 "flush": true, 00:25:23.291 "reset": true, 00:25:23.291 "compare": false, 00:25:23.291 "compare_and_write": false, 00:25:23.291 "abort": true, 00:25:23.291 "nvme_admin": true, 00:25:23.291 "nvme_io": true 00:25:23.291 }, 00:25:23.291 "driver_specific": { 00:25:23.291 "nvme": [ 00:25:23.291 { 00:25:23.291 "pci_address": "0000:81:00.0", 00:25:23.291 "trid": { 00:25:23.291 "trtype": "PCIe", 00:25:23.291 "traddr": "0000:81:00.0" 00:25:23.291 }, 00:25:23.291 "ctrlr_data": { 
00:25:23.291 "cntlid": 0, 00:25:23.291 "vendor_id": "0x8086", 00:25:23.291 "model_number": "INTEL SSDPE2KX020T8", 00:25:23.291 "serial_number": "PHLJ951302VM2P0BGN", 00:25:23.291 "firmware_revision": "VDV10184", 00:25:23.291 "oacs": { 00:25:23.291 "security": 0, 00:25:23.291 "format": 1, 00:25:23.291 "firmware": 1, 00:25:23.291 "ns_manage": 1 00:25:23.291 }, 00:25:23.291 "multi_ctrlr": false, 00:25:23.291 "ana_reporting": false 00:25:23.291 }, 00:25:23.291 "vs": { 00:25:23.291 "nvme_version": "1.2" 00:25:23.291 }, 00:25:23.291 "ns_data": { 00:25:23.291 "id": 1, 00:25:23.291 "can_share": false 00:25:23.291 } 00:25:23.291 } 00:25:23.291 ], 00:25:23.291 "mp_policy": "active_passive" 00:25:23.291 } 00:25:23.291 } 00:25:23.291 ] 00:25:23.548 04:26:11 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:25:23.548 04:26:11 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:23.548 [2024-05-15 04:26:11.535248] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2732c60 PMD being used: compress_qat 00:25:24.922 4f06370d-1e2a-49c2-b563-723346bb3e32 00:25:24.922 04:26:12 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:24.922 d9ebe76e-c3ed-414f-ade9-d8152e3eddcc 00:25:24.922 04:26:12 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:24.922 04:26:12 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:25:24.922 04:26:12 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:24.922 04:26:12 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:25:24.922 04:26:12 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:24.922 04:26:12 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:24.922 04:26:12 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:25.180 04:26:13 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:25.438 [ 00:25:25.438 { 00:25:25.438 "name": "d9ebe76e-c3ed-414f-ade9-d8152e3eddcc", 00:25:25.438 "aliases": [ 00:25:25.438 "lvs0/lv0" 00:25:25.438 ], 00:25:25.438 "product_name": "Logical Volume", 00:25:25.438 "block_size": 512, 00:25:25.438 "num_blocks": 204800, 00:25:25.438 "uuid": "d9ebe76e-c3ed-414f-ade9-d8152e3eddcc", 00:25:25.438 "assigned_rate_limits": { 00:25:25.438 "rw_ios_per_sec": 0, 00:25:25.438 "rw_mbytes_per_sec": 0, 00:25:25.439 "r_mbytes_per_sec": 0, 00:25:25.439 "w_mbytes_per_sec": 0 00:25:25.439 }, 00:25:25.439 "claimed": false, 00:25:25.439 "zoned": false, 00:25:25.439 "supported_io_types": { 00:25:25.439 "read": true, 00:25:25.439 "write": true, 00:25:25.439 "unmap": true, 00:25:25.439 "write_zeroes": true, 00:25:25.439 "flush": false, 00:25:25.439 "reset": true, 00:25:25.439 "compare": false, 00:25:25.439 "compare_and_write": false, 00:25:25.439 "abort": false, 00:25:25.439 "nvme_admin": false, 00:25:25.439 "nvme_io": false 00:25:25.439 }, 00:25:25.439 "driver_specific": { 00:25:25.439 "lvol": { 00:25:25.439 "lvol_store_uuid": "4f06370d-1e2a-49c2-b563-723346bb3e32", 00:25:25.439 "base_bdev": "Nvme0n1", 00:25:25.439 "thin_provision": true, 00:25:25.439 "num_allocated_clusters": 0, 
00:25:25.439 "snapshot": false, 00:25:25.439 "clone": false, 00:25:25.439 "esnap_clone": false 00:25:25.439 } 00:25:25.439 } 00:25:25.439 } 00:25:25.439 ] 00:25:25.695 04:26:13 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:25:25.695 04:26:13 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:25:25.695 04:26:13 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:25:25.695 [2024-05-15 04:26:13.696857] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:25.695 COMP_lvs0/lv0 00:25:25.952 04:26:13 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:25.952 04:26:13 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:25:25.952 04:26:13 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:25.952 04:26:13 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:25:25.952 04:26:13 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:25.952 04:26:13 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:25.952 04:26:13 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:25.952 04:26:13 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:26.517 [ 00:25:26.517 { 00:25:26.517 "name": "COMP_lvs0/lv0", 00:25:26.517 "aliases": [ 00:25:26.517 "90eb3f7d-8b96-5561-82f2-dff71cef355c" 00:25:26.517 ], 00:25:26.517 "product_name": "compress", 00:25:26.517 "block_size": 4096, 00:25:26.517 "num_blocks": 25088, 00:25:26.517 "uuid": "90eb3f7d-8b96-5561-82f2-dff71cef355c", 00:25:26.517 "assigned_rate_limits": { 00:25:26.517 "rw_ios_per_sec": 0, 00:25:26.517 "rw_mbytes_per_sec": 0, 00:25:26.517 "r_mbytes_per_sec": 0, 00:25:26.517 "w_mbytes_per_sec": 0 00:25:26.517 }, 00:25:26.517 "claimed": false, 00:25:26.517 "zoned": false, 00:25:26.518 "supported_io_types": { 00:25:26.518 "read": true, 00:25:26.518 "write": true, 00:25:26.518 "unmap": false, 00:25:26.518 "write_zeroes": true, 00:25:26.518 "flush": false, 00:25:26.518 "reset": false, 00:25:26.518 "compare": false, 00:25:26.518 "compare_and_write": false, 00:25:26.518 "abort": false, 00:25:26.518 "nvme_admin": false, 00:25:26.518 "nvme_io": false 00:25:26.518 }, 00:25:26.518 "driver_specific": { 00:25:26.518 "compress": { 00:25:26.518 "name": "COMP_lvs0/lv0", 00:25:26.518 "base_bdev_name": "d9ebe76e-c3ed-414f-ade9-d8152e3eddcc" 00:25:26.518 } 00:25:26.518 } 00:25:26.518 } 00:25:26.518 ] 00:25:26.518 04:26:14 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:25:26.518 04:26:14 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:25:26.518 [2024-05-15 04:26:14.339679] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f61941b15a0 PMD being used: compress_qat 00:25:26.518 [2024-05-15 04:26:14.341599] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x28d8960 PMD being used: compress_qat 00:25:26.518 Running I/O for 3 seconds... 
00:25:29.800 00:25:29.800 Latency(us) 00:25:29.800 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:29.800 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:25:29.800 Verification LBA range: start 0x0 length 0x3100 00:25:29.800 COMP_lvs0/lv0 : 3.01 3329.21 13.00 0.00 0.00 9558.47 190.39 16019.91 00:25:29.800 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:25:29.800 Verification LBA range: start 0x3100 length 0x3100 00:25:29.800 COMP_lvs0/lv0 : 3.01 3410.63 13.32 0.00 0.00 9335.38 185.08 16117.00 00:25:29.800 =================================================================================================================== 00:25:29.800 Total : 6739.84 26.33 0.00 0.00 9445.61 185.08 16117.00 00:25:29.800 0 00:25:29.800 04:26:17 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:25:29.800 04:26:17 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:29.800 04:26:17 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:30.058 04:26:17 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:25:30.058 04:26:17 compress_compdev -- compress/compress.sh@78 -- # killprocess 3964196 00:25:30.058 04:26:17 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 3964196 ']' 00:25:30.058 04:26:17 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 3964196 00:25:30.058 04:26:17 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:25:30.058 04:26:17 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:25:30.058 04:26:17 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3964196 00:25:30.058 04:26:17 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:25:30.058 04:26:17 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:25:30.058 04:26:17 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3964196' 00:25:30.058 killing process with pid 3964196 00:25:30.058 04:26:17 compress_compdev -- common/autotest_common.sh@965 -- # kill 3964196 00:25:30.058 Received shutdown signal, test time was about 3.000000 seconds 00:25:30.058 00:25:30.058 Latency(us) 00:25:30.058 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:30.058 =================================================================================================================== 00:25:30.058 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:30.058 04:26:17 compress_compdev -- common/autotest_common.sh@970 -- # wait 3964196 00:25:32.588 04:26:20 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:25:32.588 04:26:20 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:25:32.588 04:26:20 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=3965809 00:25:32.588 04:26:20 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:25:32.588 04:26:20 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:32.588 04:26:20 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 
3965809 00:25:32.588 04:26:20 compress_compdev -- common/autotest_common.sh@827 -- # '[' -z 3965809 ']' 00:25:32.588 04:26:20 compress_compdev -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:32.588 04:26:20 compress_compdev -- common/autotest_common.sh@832 -- # local max_retries=100 00:25:32.588 04:26:20 compress_compdev -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:32.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:32.588 04:26:20 compress_compdev -- common/autotest_common.sh@836 -- # xtrace_disable 00:25:32.588 04:26:20 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:25:32.588 [2024-05-15 04:26:20.522396] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:25:32.588 [2024-05-15 04:26:20.522477] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3965809 ] 00:25:32.846 [2024-05-15 04:26:20.604883] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:25:32.846 [2024-05-15 04:26:20.723437] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:25:32.846 [2024-05-15 04:26:20.723504] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:25:32.846 [2024-05-15 04:26:20.723506] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:25:33.412 [2024-05-15 04:26:21.325037] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:25:33.412 04:26:21 compress_compdev -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:33.412 04:26:21 compress_compdev -- common/autotest_common.sh@860 -- # return 0 00:25:33.412 04:26:21 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:25:33.412 04:26:21 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:33.412 04:26:21 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:36.762 [2024-05-15 04:26:24.490411] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f682e0 PMD being used: compress_qat 00:25:36.762 04:26:24 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:36.762 04:26:24 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:25:36.762 04:26:24 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:36.762 04:26:24 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:25:36.762 04:26:24 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:36.762 04:26:24 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:36.762 04:26:24 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:37.018 04:26:24 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:25:37.275 [ 00:25:37.275 { 00:25:37.275 "name": "Nvme0n1", 00:25:37.275 "aliases": [ 00:25:37.275 "2db4eca1-49e4-4dcc-8261-7090798b46f0" 00:25:37.275 ], 00:25:37.275 "product_name": "NVMe disk", 00:25:37.275 "block_size": 512, 00:25:37.275 "num_blocks": 
3907029168, 00:25:37.275 "uuid": "2db4eca1-49e4-4dcc-8261-7090798b46f0", 00:25:37.275 "assigned_rate_limits": { 00:25:37.275 "rw_ios_per_sec": 0, 00:25:37.275 "rw_mbytes_per_sec": 0, 00:25:37.275 "r_mbytes_per_sec": 0, 00:25:37.275 "w_mbytes_per_sec": 0 00:25:37.275 }, 00:25:37.275 "claimed": false, 00:25:37.275 "zoned": false, 00:25:37.275 "supported_io_types": { 00:25:37.275 "read": true, 00:25:37.275 "write": true, 00:25:37.275 "unmap": true, 00:25:37.275 "write_zeroes": true, 00:25:37.275 "flush": true, 00:25:37.275 "reset": true, 00:25:37.275 "compare": false, 00:25:37.275 "compare_and_write": false, 00:25:37.275 "abort": true, 00:25:37.275 "nvme_admin": true, 00:25:37.275 "nvme_io": true 00:25:37.275 }, 00:25:37.275 "driver_specific": { 00:25:37.275 "nvme": [ 00:25:37.275 { 00:25:37.275 "pci_address": "0000:81:00.0", 00:25:37.275 "trid": { 00:25:37.275 "trtype": "PCIe", 00:25:37.275 "traddr": "0000:81:00.0" 00:25:37.275 }, 00:25:37.275 "ctrlr_data": { 00:25:37.275 "cntlid": 0, 00:25:37.275 "vendor_id": "0x8086", 00:25:37.275 "model_number": "INTEL SSDPE2KX020T8", 00:25:37.275 "serial_number": "PHLJ951302VM2P0BGN", 00:25:37.275 "firmware_revision": "VDV10184", 00:25:37.275 "oacs": { 00:25:37.275 "security": 0, 00:25:37.275 "format": 1, 00:25:37.275 "firmware": 1, 00:25:37.275 "ns_manage": 1 00:25:37.275 }, 00:25:37.275 "multi_ctrlr": false, 00:25:37.275 "ana_reporting": false 00:25:37.275 }, 00:25:37.275 "vs": { 00:25:37.275 "nvme_version": "1.2" 00:25:37.275 }, 00:25:37.275 "ns_data": { 00:25:37.275 "id": 1, 00:25:37.275 "can_share": false 00:25:37.275 } 00:25:37.275 } 00:25:37.275 ], 00:25:37.275 "mp_policy": "active_passive" 00:25:37.275 } 00:25:37.275 } 00:25:37.275 ] 00:25:37.275 04:26:25 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:25:37.275 04:26:25 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:37.532 [2024-05-15 04:26:25.312156] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1db6710 PMD being used: compress_qat 00:25:38.463 e5623833-4201-4bc7-a75c-fe6cf0d3f02c 00:25:38.463 04:26:26 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:38.720 24e20aae-e02a-4ed4-87a5-3a8ae7cb2f31 00:25:38.720 04:26:26 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:38.720 04:26:26 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:25:38.720 04:26:26 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:38.720 04:26:26 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:25:38.720 04:26:26 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:38.720 04:26:26 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:38.720 04:26:26 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:38.977 04:26:26 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:38.977 [ 00:25:38.977 { 00:25:38.977 "name": "24e20aae-e02a-4ed4-87a5-3a8ae7cb2f31", 00:25:38.977 "aliases": [ 00:25:38.977 "lvs0/lv0" 00:25:38.977 ], 00:25:38.977 "product_name": "Logical Volume", 00:25:38.977 "block_size": 512, 00:25:38.977 "num_blocks": 
204800, 00:25:38.977 "uuid": "24e20aae-e02a-4ed4-87a5-3a8ae7cb2f31", 00:25:38.977 "assigned_rate_limits": { 00:25:38.977 "rw_ios_per_sec": 0, 00:25:38.977 "rw_mbytes_per_sec": 0, 00:25:38.977 "r_mbytes_per_sec": 0, 00:25:38.977 "w_mbytes_per_sec": 0 00:25:38.977 }, 00:25:38.977 "claimed": false, 00:25:38.977 "zoned": false, 00:25:38.977 "supported_io_types": { 00:25:38.977 "read": true, 00:25:38.977 "write": true, 00:25:38.977 "unmap": true, 00:25:38.977 "write_zeroes": true, 00:25:38.977 "flush": false, 00:25:38.978 "reset": true, 00:25:38.978 "compare": false, 00:25:38.978 "compare_and_write": false, 00:25:38.978 "abort": false, 00:25:38.978 "nvme_admin": false, 00:25:38.978 "nvme_io": false 00:25:38.978 }, 00:25:38.978 "driver_specific": { 00:25:38.978 "lvol": { 00:25:38.978 "lvol_store_uuid": "e5623833-4201-4bc7-a75c-fe6cf0d3f02c", 00:25:38.978 "base_bdev": "Nvme0n1", 00:25:38.978 "thin_provision": true, 00:25:38.978 "num_allocated_clusters": 0, 00:25:38.978 "snapshot": false, 00:25:38.978 "clone": false, 00:25:38.978 "esnap_clone": false 00:25:38.978 } 00:25:38.978 } 00:25:38.978 } 00:25:38.978 ] 00:25:39.235 04:26:27 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:25:39.235 04:26:27 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:25:39.235 04:26:27 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:25:39.235 [2024-05-15 04:26:27.236276] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:39.235 COMP_lvs0/lv0 00:25:39.492 04:26:27 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:39.492 04:26:27 compress_compdev -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:25:39.492 04:26:27 compress_compdev -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:39.492 04:26:27 compress_compdev -- common/autotest_common.sh@897 -- # local i 00:25:39.492 04:26:27 compress_compdev -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:39.492 04:26:27 compress_compdev -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:39.492 04:26:27 compress_compdev -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:39.492 04:26:27 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:39.749 [ 00:25:39.749 { 00:25:39.749 "name": "COMP_lvs0/lv0", 00:25:39.749 "aliases": [ 00:25:39.749 "17e2e228-e487-511c-b83e-a53721014bc7" 00:25:39.749 ], 00:25:39.749 "product_name": "compress", 00:25:39.749 "block_size": 512, 00:25:39.749 "num_blocks": 200704, 00:25:39.749 "uuid": "17e2e228-e487-511c-b83e-a53721014bc7", 00:25:39.749 "assigned_rate_limits": { 00:25:39.749 "rw_ios_per_sec": 0, 00:25:39.749 "rw_mbytes_per_sec": 0, 00:25:39.749 "r_mbytes_per_sec": 0, 00:25:39.749 "w_mbytes_per_sec": 0 00:25:39.749 }, 00:25:39.749 "claimed": false, 00:25:39.749 "zoned": false, 00:25:39.749 "supported_io_types": { 00:25:39.749 "read": true, 00:25:39.749 "write": true, 00:25:39.749 "unmap": false, 00:25:39.749 "write_zeroes": true, 00:25:39.749 "flush": false, 00:25:39.749 "reset": false, 00:25:39.749 "compare": false, 00:25:39.749 "compare_and_write": false, 00:25:39.749 "abort": false, 00:25:39.749 "nvme_admin": false, 00:25:39.749 "nvme_io": false 00:25:39.749 }, 
00:25:39.749 "driver_specific": { 00:25:39.749 "compress": { 00:25:39.749 "name": "COMP_lvs0/lv0", 00:25:39.749 "base_bdev_name": "24e20aae-e02a-4ed4-87a5-3a8ae7cb2f31" 00:25:39.749 } 00:25:39.749 } 00:25:39.749 } 00:25:39.749 ] 00:25:39.749 04:26:27 compress_compdev -- common/autotest_common.sh@903 -- # return 0 00:25:39.749 04:26:27 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:25:40.006 [2024-05-15 04:26:27.841518] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ffa081b1330 PMD being used: compress_qat 00:25:40.006 I/O targets: 00:25:40.006 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:25:40.006 00:25:40.006 00:25:40.006 CUnit - A unit testing framework for C - Version 2.1-3 00:25:40.006 http://cunit.sourceforge.net/ 00:25:40.006 00:25:40.006 00:25:40.006 Suite: bdevio tests on: COMP_lvs0/lv0 00:25:40.006 Test: blockdev write read block ...passed 00:25:40.006 Test: blockdev write zeroes read block ...passed 00:25:40.006 Test: blockdev write zeroes read no split ...passed 00:25:40.006 Test: blockdev write zeroes read split ...passed 00:25:40.006 Test: blockdev write zeroes read split partial ...passed 00:25:40.006 Test: blockdev reset ...[2024-05-15 04:26:27.895069] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:25:40.006 passed 00:25:40.006 Test: blockdev write read 8 blocks ...passed 00:25:40.006 Test: blockdev write read size > 128k ...passed 00:25:40.006 Test: blockdev write read invalid size ...passed 00:25:40.006 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:25:40.006 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:25:40.006 Test: blockdev write read max offset ...passed 00:25:40.006 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:25:40.006 Test: blockdev writev readv 8 blocks ...passed 00:25:40.006 Test: blockdev writev readv 30 x 1block ...passed 00:25:40.006 Test: blockdev writev readv block ...passed 00:25:40.006 Test: blockdev writev readv size > 128k ...passed 00:25:40.006 Test: blockdev writev readv size > 128k in two iovs ...passed 00:25:40.007 Test: blockdev comparev and writev ...passed 00:25:40.007 Test: blockdev nvme passthru rw ...passed 00:25:40.007 Test: blockdev nvme passthru vendor specific ...passed 00:25:40.007 Test: blockdev nvme admin passthru ...passed 00:25:40.007 Test: blockdev copy ...passed 00:25:40.007 00:25:40.007 Run Summary: Type Total Ran Passed Failed Inactive 00:25:40.007 suites 1 1 n/a 0 0 00:25:40.007 tests 23 23 23 0 0 00:25:40.007 asserts 130 130 130 0 n/a 00:25:40.007 00:25:40.007 Elapsed time = 0.183 seconds 00:25:40.007 0 00:25:40.007 04:26:27 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:25:40.007 04:26:27 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:40.264 04:26:28 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:40.521 04:26:28 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:25:40.521 04:26:28 compress_compdev -- compress/compress.sh@62 -- # killprocess 3965809 00:25:40.521 04:26:28 compress_compdev -- common/autotest_common.sh@946 -- # '[' -z 3965809 ']' 00:25:40.521 04:26:28 compress_compdev -- common/autotest_common.sh@950 -- # kill -0 3965809 
00:25:40.521 04:26:28 compress_compdev -- common/autotest_common.sh@951 -- # uname 00:25:40.521 04:26:28 compress_compdev -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:25:40.521 04:26:28 compress_compdev -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3965809 00:25:40.521 04:26:28 compress_compdev -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:25:40.521 04:26:28 compress_compdev -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:25:40.521 04:26:28 compress_compdev -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3965809' 00:25:40.521 killing process with pid 3965809 00:25:40.521 04:26:28 compress_compdev -- common/autotest_common.sh@965 -- # kill 3965809 00:25:40.521 04:26:28 compress_compdev -- common/autotest_common.sh@970 -- # wait 3965809 00:25:43.048 04:26:30 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:25:43.048 04:26:30 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:25:43.048 00:25:43.048 real 0m51.532s 00:25:43.048 user 1m57.877s 00:25:43.048 sys 0m4.686s 00:25:43.048 04:26:30 compress_compdev -- common/autotest_common.sh@1122 -- # xtrace_disable 00:25:43.048 04:26:30 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:25:43.048 ************************************ 00:25:43.048 END TEST compress_compdev 00:25:43.048 ************************************ 00:25:43.048 04:26:30 -- spdk/autotest.sh@345 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:25:43.048 04:26:30 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:25:43.048 04:26:30 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:25:43.048 04:26:30 -- common/autotest_common.sh@10 -- # set +x 00:25:43.048 ************************************ 00:25:43.048 START TEST compress_isal 00:25:43.048 ************************************ 00:25:43.048 04:26:30 compress_isal -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:25:43.048 * Looking for test storage... 
00:25:43.048 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:25:43.048 04:26:31 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8b464f06-2980-e311-ba20-001e67a94acd 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=8b464f06-2980-e311-ba20-001e67a94acd 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:25:43.048 04:26:31 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:25:43.048 04:26:31 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:25:43.048 04:26:31 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:25:43.048 04:26:31 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:25:43.048 04:26:31 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:43.048 04:26:31 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:43.048 04:26:31 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:43.048 04:26:31 compress_isal -- paths/export.sh@5 -- # export PATH 00:25:43.048 04:26:31 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:25:43.049 04:26:31 compress_isal -- nvmf/common.sh@47 -- # : 0 00:25:43.049 04:26:31 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:25:43.049 04:26:31 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:25:43.049 04:26:31 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:25:43.049 04:26:31 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:25:43.049 04:26:31 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:25:43.049 04:26:31 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:25:43.049 04:26:31 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:25:43.049 04:26:31 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:25:43.049 04:26:31 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:43.049 04:26:31 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:25:43.049 04:26:31 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:25:43.049 04:26:31 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:25:43.049 04:26:31 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:25:43.049 04:26:31 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=3967074 00:25:43.049 04:26:31 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:25:43.049 04:26:31 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:43.049 04:26:31 compress_isal -- compress/compress.sh@73 -- # waitforlisten 3967074 00:25:43.049 04:26:31 compress_isal -- common/autotest_common.sh@827 -- # '[' -z 3967074 ']' 00:25:43.049 04:26:31 compress_isal -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:43.049 04:26:31 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:25:43.049 04:26:31 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:43.049 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
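For readers following the trace, the launch sequence above reduces to the shell sketch below. The bdevperf path and flags are copied from the log; the polling loop is a simplified stand-in for the waitforlisten helper, and its use of rpc_get_methods plus a 0.1 s sleep is an assumption rather than the exact autotest_common.sh implementation.

  SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
  # start bdevperf idle (-z) so the volume stack can be created over RPC before any I/O is queued
  $SPDK/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
  bdevperf_pid=$!
  # poll the default RPC socket until the application answers (simplified waitforlisten)
  until $SPDK/scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
      sleep 0.1
  done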
00:25:43.049 04:26:31 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:25:43.049 04:26:31 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:25:43.307 [2024-05-15 04:26:31.076050] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:25:43.307 [2024-05-15 04:26:31.076145] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3967074 ] 00:25:43.307 [2024-05-15 04:26:31.150737] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:43.307 [2024-05-15 04:26:31.262632] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:25:43.307 [2024-05-15 04:26:31.262634] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:25:44.240 04:26:32 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:44.240 04:26:32 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:25:44.240 04:26:32 compress_isal -- compress/compress.sh@74 -- # create_vols 00:25:44.240 04:26:32 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:44.240 04:26:32 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:25:47.520 04:26:35 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:25:47.520 04:26:35 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:25:47.520 04:26:35 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:47.520 04:26:35 compress_isal -- common/autotest_common.sh@897 -- # local i 00:25:47.520 04:26:35 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:47.520 04:26:35 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:47.520 04:26:35 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:47.520 04:26:35 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:25:47.777 [ 00:25:47.777 { 00:25:47.777 "name": "Nvme0n1", 00:25:47.777 "aliases": [ 00:25:47.777 "727da103-08f5-4209-94f7-891f477a90fd" 00:25:47.777 ], 00:25:47.777 "product_name": "NVMe disk", 00:25:47.777 "block_size": 512, 00:25:47.777 "num_blocks": 3907029168, 00:25:47.777 "uuid": "727da103-08f5-4209-94f7-891f477a90fd", 00:25:47.777 "assigned_rate_limits": { 00:25:47.777 "rw_ios_per_sec": 0, 00:25:47.777 "rw_mbytes_per_sec": 0, 00:25:47.777 "r_mbytes_per_sec": 0, 00:25:47.777 "w_mbytes_per_sec": 0 00:25:47.777 }, 00:25:47.777 "claimed": false, 00:25:47.777 "zoned": false, 00:25:47.777 "supported_io_types": { 00:25:47.777 "read": true, 00:25:47.777 "write": true, 00:25:47.777 "unmap": true, 00:25:47.777 "write_zeroes": true, 00:25:47.777 "flush": true, 00:25:47.777 "reset": true, 00:25:47.777 "compare": false, 00:25:47.777 "compare_and_write": false, 00:25:47.777 "abort": true, 00:25:47.777 "nvme_admin": true, 00:25:47.777 "nvme_io": true 00:25:47.777 }, 00:25:47.777 "driver_specific": { 00:25:47.777 "nvme": [ 00:25:47.777 { 00:25:47.777 "pci_address": "0000:81:00.0", 00:25:47.777 "trid": { 00:25:47.777 "trtype": "PCIe", 00:25:47.777 "traddr": "0000:81:00.0" 00:25:47.777 }, 00:25:47.777 "ctrlr_data": { 00:25:47.777 "cntlid": 0, 
00:25:47.777 "vendor_id": "0x8086", 00:25:47.777 "model_number": "INTEL SSDPE2KX020T8", 00:25:47.777 "serial_number": "PHLJ951302VM2P0BGN", 00:25:47.777 "firmware_revision": "VDV10184", 00:25:47.777 "oacs": { 00:25:47.777 "security": 0, 00:25:47.777 "format": 1, 00:25:47.777 "firmware": 1, 00:25:47.777 "ns_manage": 1 00:25:47.777 }, 00:25:47.777 "multi_ctrlr": false, 00:25:47.777 "ana_reporting": false 00:25:47.777 }, 00:25:47.777 "vs": { 00:25:47.777 "nvme_version": "1.2" 00:25:47.777 }, 00:25:47.777 "ns_data": { 00:25:47.777 "id": 1, 00:25:47.777 "can_share": false 00:25:47.777 } 00:25:47.777 } 00:25:47.777 ], 00:25:47.777 "mp_policy": "active_passive" 00:25:47.777 } 00:25:47.777 } 00:25:47.778 ] 00:25:47.778 04:26:35 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:25:47.778 04:26:35 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:25:49.149 6581cc3c-177b-4bab-a7cc-e2b7d1517abf 00:25:49.149 04:26:36 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:25:49.149 5f84092f-32d0-4e12-ba83-ec2183e16c45 00:25:49.149 04:26:37 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:25:49.149 04:26:37 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:25:49.149 04:26:37 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:49.149 04:26:37 compress_isal -- common/autotest_common.sh@897 -- # local i 00:25:49.149 04:26:37 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:49.149 04:26:37 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:49.149 04:26:37 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:49.406 04:26:37 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:25:49.664 [ 00:25:49.664 { 00:25:49.664 "name": "5f84092f-32d0-4e12-ba83-ec2183e16c45", 00:25:49.664 "aliases": [ 00:25:49.664 "lvs0/lv0" 00:25:49.664 ], 00:25:49.664 "product_name": "Logical Volume", 00:25:49.664 "block_size": 512, 00:25:49.664 "num_blocks": 204800, 00:25:49.664 "uuid": "5f84092f-32d0-4e12-ba83-ec2183e16c45", 00:25:49.664 "assigned_rate_limits": { 00:25:49.664 "rw_ios_per_sec": 0, 00:25:49.664 "rw_mbytes_per_sec": 0, 00:25:49.664 "r_mbytes_per_sec": 0, 00:25:49.664 "w_mbytes_per_sec": 0 00:25:49.664 }, 00:25:49.664 "claimed": false, 00:25:49.664 "zoned": false, 00:25:49.664 "supported_io_types": { 00:25:49.664 "read": true, 00:25:49.664 "write": true, 00:25:49.664 "unmap": true, 00:25:49.664 "write_zeroes": true, 00:25:49.664 "flush": false, 00:25:49.664 "reset": true, 00:25:49.664 "compare": false, 00:25:49.664 "compare_and_write": false, 00:25:49.664 "abort": false, 00:25:49.664 "nvme_admin": false, 00:25:49.664 "nvme_io": false 00:25:49.664 }, 00:25:49.664 "driver_specific": { 00:25:49.664 "lvol": { 00:25:49.664 "lvol_store_uuid": "6581cc3c-177b-4bab-a7cc-e2b7d1517abf", 00:25:49.664 "base_bdev": "Nvme0n1", 00:25:49.664 "thin_provision": true, 00:25:49.664 "num_allocated_clusters": 0, 00:25:49.664 "snapshot": false, 00:25:49.664 "clone": false, 00:25:49.664 "esnap_clone": false 00:25:49.664 } 00:25:49.664 } 00:25:49.664 } 00:25:49.664 ] 00:25:49.664 04:26:37 compress_isal -- 
common/autotest_common.sh@903 -- # return 0 00:25:49.664 04:26:37 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:25:49.664 04:26:37 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:25:49.922 [2024-05-15 04:26:37.886606] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:25:49.922 COMP_lvs0/lv0 00:25:49.922 04:26:37 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:25:49.922 04:26:37 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:25:49.922 04:26:37 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:25:49.922 04:26:37 compress_isal -- common/autotest_common.sh@897 -- # local i 00:25:49.922 04:26:37 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:25:49.922 04:26:37 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:25:49.922 04:26:37 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:25:50.179 04:26:38 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:25:50.436 [ 00:25:50.436 { 00:25:50.436 "name": "COMP_lvs0/lv0", 00:25:50.436 "aliases": [ 00:25:50.436 "16686d68-7862-56b6-8913-3d882b5f6c86" 00:25:50.436 ], 00:25:50.436 "product_name": "compress", 00:25:50.436 "block_size": 512, 00:25:50.437 "num_blocks": 200704, 00:25:50.437 "uuid": "16686d68-7862-56b6-8913-3d882b5f6c86", 00:25:50.437 "assigned_rate_limits": { 00:25:50.437 "rw_ios_per_sec": 0, 00:25:50.437 "rw_mbytes_per_sec": 0, 00:25:50.437 "r_mbytes_per_sec": 0, 00:25:50.437 "w_mbytes_per_sec": 0 00:25:50.437 }, 00:25:50.437 "claimed": false, 00:25:50.437 "zoned": false, 00:25:50.437 "supported_io_types": { 00:25:50.437 "read": true, 00:25:50.437 "write": true, 00:25:50.437 "unmap": false, 00:25:50.437 "write_zeroes": true, 00:25:50.437 "flush": false, 00:25:50.437 "reset": false, 00:25:50.437 "compare": false, 00:25:50.437 "compare_and_write": false, 00:25:50.437 "abort": false, 00:25:50.437 "nvme_admin": false, 00:25:50.437 "nvme_io": false 00:25:50.437 }, 00:25:50.437 "driver_specific": { 00:25:50.437 "compress": { 00:25:50.437 "name": "COMP_lvs0/lv0", 00:25:50.437 "base_bdev_name": "5f84092f-32d0-4e12-ba83-ec2183e16c45" 00:25:50.437 } 00:25:50.437 } 00:25:50.437 } 00:25:50.437 ] 00:25:50.437 04:26:38 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:25:50.437 04:26:38 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:25:50.694 Running I/O for 3 seconds... 
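Condensed from the create_vols trace above, the volume stack under test is assembled with the RPC calls below before perform_tests is issued. Every command appears verbatim in the log; the pipe from gen_nvme.sh into load_subsystem_config is inferred from the two calls sharing compress.sh line 34, and error handling is omitted.

  rpc_py=$SPDK/scripts/rpc.py
  $SPDK/scripts/gen_nvme.sh | $rpc_py load_subsystem_config         # attach the NVMe disk as Nvme0n1
  $rpc_py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
  $rpc_py bdev_lvol_create -t -l lvs0 lv0 100                       # 100 MiB thin-provisioned lvol (204800 x 512 B)
  $rpc_py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem             # compress vbdev COMP_lvs0/lv0 on top of it
  $SPDK/examples/bdev/bdevperf/bdevperf.py perform_tests            # kick off the queued 3-second verify workload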
00:25:53.973 00:25:53.973 Latency(us) 00:25:53.973 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:53.973 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:25:53.973 Verification LBA range: start 0x0 length 0x3100 00:25:53.973 COMP_lvs0/lv0 : 3.01 2713.69 10.60 0.00 0.00 11734.04 82.68 19515.16 00:25:53.973 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:25:53.973 Verification LBA range: start 0x3100 length 0x3100 00:25:53.973 COMP_lvs0/lv0 : 3.01 2678.98 10.46 0.00 0.00 11890.40 80.40 19223.89 00:25:53.973 =================================================================================================================== 00:25:53.973 Total : 5392.67 21.07 0.00 0.00 11811.71 80.40 19515.16 00:25:53.973 0 00:25:53.973 04:26:41 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:25:53.973 04:26:41 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:25:53.973 04:26:41 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:25:54.231 04:26:42 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:25:54.231 04:26:42 compress_isal -- compress/compress.sh@78 -- # killprocess 3967074 00:25:54.231 04:26:42 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 3967074 ']' 00:25:54.231 04:26:42 compress_isal -- common/autotest_common.sh@950 -- # kill -0 3967074 00:25:54.231 04:26:42 compress_isal -- common/autotest_common.sh@951 -- # uname 00:25:54.231 04:26:42 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:25:54.231 04:26:42 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3967074 00:25:54.231 04:26:42 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:25:54.231 04:26:42 compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:25:54.231 04:26:42 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3967074' 00:25:54.231 killing process with pid 3967074 00:25:54.231 04:26:42 compress_isal -- common/autotest_common.sh@965 -- # kill 3967074 00:25:54.231 Received shutdown signal, test time was about 3.000000 seconds 00:25:54.231 00:25:54.231 Latency(us) 00:25:54.231 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:54.231 =================================================================================================================== 00:25:54.231 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:54.231 04:26:42 compress_isal -- common/autotest_common.sh@970 -- # wait 3967074 00:25:56.757 04:26:44 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:25:56.757 04:26:44 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:25:56.757 04:26:44 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=3968686 00:25:56.757 04:26:44 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:25:56.757 04:26:44 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:25:56.757 04:26:44 compress_isal -- compress/compress.sh@73 -- # waitforlisten 3968686 00:25:56.757 04:26:44 compress_isal -- common/autotest_common.sh@827 -- # 
'[' -z 3968686 ']' 00:25:56.757 04:26:44 compress_isal -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:25:56.757 04:26:44 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:25:56.757 04:26:44 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:25:56.757 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:25:56.757 04:26:44 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:25:56.757 04:26:44 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:25:56.757 [2024-05-15 04:26:44.648498] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:25:56.757 [2024-05-15 04:26:44.648588] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3968686 ] 00:25:56.757 [2024-05-15 04:26:44.730470] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:25:57.015 [2024-05-15 04:26:44.847814] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:25:57.015 [2024-05-15 04:26:44.847817] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:25:57.579 04:26:45 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:25:57.579 04:26:45 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:25:57.579 04:26:45 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:25:57.579 04:26:45 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:25:57.579 04:26:45 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:00.857 04:26:48 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:00.857 04:26:48 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:26:00.857 04:26:48 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:00.857 04:26:48 compress_isal -- common/autotest_common.sh@897 -- # local i 00:26:00.857 04:26:48 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:00.857 04:26:48 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:00.857 04:26:48 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:01.114 04:26:48 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:01.371 [ 00:26:01.371 { 00:26:01.371 "name": "Nvme0n1", 00:26:01.371 "aliases": [ 00:26:01.371 "7df3b79f-ffca-40e1-ac27-718698c00ca1" 00:26:01.371 ], 00:26:01.371 "product_name": "NVMe disk", 00:26:01.371 "block_size": 512, 00:26:01.371 "num_blocks": 3907029168, 00:26:01.371 "uuid": "7df3b79f-ffca-40e1-ac27-718698c00ca1", 00:26:01.371 "assigned_rate_limits": { 00:26:01.372 "rw_ios_per_sec": 0, 00:26:01.372 "rw_mbytes_per_sec": 0, 00:26:01.372 "r_mbytes_per_sec": 0, 00:26:01.372 "w_mbytes_per_sec": 0 00:26:01.372 }, 00:26:01.372 "claimed": false, 00:26:01.372 "zoned": false, 00:26:01.372 "supported_io_types": { 00:26:01.372 "read": true, 00:26:01.372 "write": true, 00:26:01.372 "unmap": true, 00:26:01.372 "write_zeroes": true, 00:26:01.372 "flush": true, 
00:26:01.372 "reset": true, 00:26:01.372 "compare": false, 00:26:01.372 "compare_and_write": false, 00:26:01.372 "abort": true, 00:26:01.372 "nvme_admin": true, 00:26:01.372 "nvme_io": true 00:26:01.372 }, 00:26:01.372 "driver_specific": { 00:26:01.372 "nvme": [ 00:26:01.372 { 00:26:01.372 "pci_address": "0000:81:00.0", 00:26:01.372 "trid": { 00:26:01.372 "trtype": "PCIe", 00:26:01.372 "traddr": "0000:81:00.0" 00:26:01.372 }, 00:26:01.372 "ctrlr_data": { 00:26:01.372 "cntlid": 0, 00:26:01.372 "vendor_id": "0x8086", 00:26:01.372 "model_number": "INTEL SSDPE2KX020T8", 00:26:01.372 "serial_number": "PHLJ951302VM2P0BGN", 00:26:01.372 "firmware_revision": "VDV10184", 00:26:01.372 "oacs": { 00:26:01.372 "security": 0, 00:26:01.372 "format": 1, 00:26:01.372 "firmware": 1, 00:26:01.372 "ns_manage": 1 00:26:01.372 }, 00:26:01.372 "multi_ctrlr": false, 00:26:01.372 "ana_reporting": false 00:26:01.372 }, 00:26:01.372 "vs": { 00:26:01.372 "nvme_version": "1.2" 00:26:01.372 }, 00:26:01.372 "ns_data": { 00:26:01.372 "id": 1, 00:26:01.372 "can_share": false 00:26:01.372 } 00:26:01.372 } 00:26:01.372 ], 00:26:01.372 "mp_policy": "active_passive" 00:26:01.372 } 00:26:01.372 } 00:26:01.372 ] 00:26:01.372 04:26:49 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:26:01.372 04:26:49 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:02.744 01ab9ae1-c6ea-4fcf-952d-1651ffa8cb5c 00:26:02.744 04:26:50 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:02.744 e749f22e-a6e8-4db5-a9d8-d35679297864 00:26:02.744 04:26:50 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:02.744 04:26:50 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:26:02.744 04:26:50 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:02.744 04:26:50 compress_isal -- common/autotest_common.sh@897 -- # local i 00:26:02.744 04:26:50 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:02.744 04:26:50 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:02.744 04:26:50 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:03.002 04:26:50 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:03.259 [ 00:26:03.259 { 00:26:03.259 "name": "e749f22e-a6e8-4db5-a9d8-d35679297864", 00:26:03.259 "aliases": [ 00:26:03.259 "lvs0/lv0" 00:26:03.259 ], 00:26:03.259 "product_name": "Logical Volume", 00:26:03.259 "block_size": 512, 00:26:03.259 "num_blocks": 204800, 00:26:03.259 "uuid": "e749f22e-a6e8-4db5-a9d8-d35679297864", 00:26:03.259 "assigned_rate_limits": { 00:26:03.259 "rw_ios_per_sec": 0, 00:26:03.259 "rw_mbytes_per_sec": 0, 00:26:03.259 "r_mbytes_per_sec": 0, 00:26:03.259 "w_mbytes_per_sec": 0 00:26:03.259 }, 00:26:03.259 "claimed": false, 00:26:03.259 "zoned": false, 00:26:03.259 "supported_io_types": { 00:26:03.259 "read": true, 00:26:03.259 "write": true, 00:26:03.259 "unmap": true, 00:26:03.259 "write_zeroes": true, 00:26:03.259 "flush": false, 00:26:03.259 "reset": true, 00:26:03.259 "compare": false, 00:26:03.259 "compare_and_write": false, 00:26:03.259 "abort": false, 00:26:03.259 "nvme_admin": false, 00:26:03.259 
"nvme_io": false 00:26:03.259 }, 00:26:03.259 "driver_specific": { 00:26:03.259 "lvol": { 00:26:03.259 "lvol_store_uuid": "01ab9ae1-c6ea-4fcf-952d-1651ffa8cb5c", 00:26:03.259 "base_bdev": "Nvme0n1", 00:26:03.259 "thin_provision": true, 00:26:03.259 "num_allocated_clusters": 0, 00:26:03.259 "snapshot": false, 00:26:03.259 "clone": false, 00:26:03.259 "esnap_clone": false 00:26:03.259 } 00:26:03.259 } 00:26:03.259 } 00:26:03.259 ] 00:26:03.259 04:26:51 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:26:03.259 04:26:51 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:26:03.259 04:26:51 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:26:03.516 [2024-05-15 04:26:51.433314] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:03.516 COMP_lvs0/lv0 00:26:03.516 04:26:51 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:03.516 04:26:51 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:26:03.516 04:26:51 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:03.516 04:26:51 compress_isal -- common/autotest_common.sh@897 -- # local i 00:26:03.516 04:26:51 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:03.516 04:26:51 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:03.516 04:26:51 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:03.772 04:26:51 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:04.029 [ 00:26:04.029 { 00:26:04.030 "name": "COMP_lvs0/lv0", 00:26:04.030 "aliases": [ 00:26:04.030 "bc99c8b3-a17e-5cf9-9c2d-939ce029178f" 00:26:04.030 ], 00:26:04.030 "product_name": "compress", 00:26:04.030 "block_size": 512, 00:26:04.030 "num_blocks": 200704, 00:26:04.030 "uuid": "bc99c8b3-a17e-5cf9-9c2d-939ce029178f", 00:26:04.030 "assigned_rate_limits": { 00:26:04.030 "rw_ios_per_sec": 0, 00:26:04.030 "rw_mbytes_per_sec": 0, 00:26:04.030 "r_mbytes_per_sec": 0, 00:26:04.030 "w_mbytes_per_sec": 0 00:26:04.030 }, 00:26:04.030 "claimed": false, 00:26:04.030 "zoned": false, 00:26:04.030 "supported_io_types": { 00:26:04.030 "read": true, 00:26:04.030 "write": true, 00:26:04.030 "unmap": false, 00:26:04.030 "write_zeroes": true, 00:26:04.030 "flush": false, 00:26:04.030 "reset": false, 00:26:04.030 "compare": false, 00:26:04.030 "compare_and_write": false, 00:26:04.030 "abort": false, 00:26:04.030 "nvme_admin": false, 00:26:04.030 "nvme_io": false 00:26:04.030 }, 00:26:04.030 "driver_specific": { 00:26:04.030 "compress": { 00:26:04.030 "name": "COMP_lvs0/lv0", 00:26:04.030 "base_bdev_name": "e749f22e-a6e8-4db5-a9d8-d35679297864" 00:26:04.030 } 00:26:04.030 } 00:26:04.030 } 00:26:04.030 ] 00:26:04.030 04:26:51 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:26:04.030 04:26:51 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:04.030 Running I/O for 3 seconds... 
00:26:07.332 00:26:07.332 Latency(us) 00:26:07.332 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:07.332 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:07.332 Verification LBA range: start 0x0 length 0x3100 00:26:07.332 COMP_lvs0/lv0 : 3.01 2763.02 10.79 0.00 0.00 11531.04 81.16 16505.36 00:26:07.332 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:07.332 Verification LBA range: start 0x3100 length 0x3100 00:26:07.332 COMP_lvs0/lv0 : 3.01 2813.52 10.99 0.00 0.00 11320.91 82.30 16505.36 00:26:07.332 =================================================================================================================== 00:26:07.332 Total : 5576.53 21.78 0.00 0.00 11425.02 81.16 16505.36 00:26:07.332 0 00:26:07.332 04:26:55 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:26:07.332 04:26:55 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:07.332 04:26:55 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:07.589 04:26:55 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:07.589 04:26:55 compress_isal -- compress/compress.sh@78 -- # killprocess 3968686 00:26:07.589 04:26:55 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 3968686 ']' 00:26:07.589 04:26:55 compress_isal -- common/autotest_common.sh@950 -- # kill -0 3968686 00:26:07.589 04:26:55 compress_isal -- common/autotest_common.sh@951 -- # uname 00:26:07.589 04:26:55 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:07.589 04:26:55 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3968686 00:26:07.589 04:26:55 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:26:07.589 04:26:55 compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:26:07.589 04:26:55 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3968686' 00:26:07.589 killing process with pid 3968686 00:26:07.589 04:26:55 compress_isal -- common/autotest_common.sh@965 -- # kill 3968686 00:26:07.589 Received shutdown signal, test time was about 3.000000 seconds 00:26:07.589 00:26:07.589 Latency(us) 00:26:07.589 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:07.589 =================================================================================================================== 00:26:07.589 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:07.589 04:26:55 compress_isal -- common/autotest_common.sh@970 -- # wait 3968686 00:26:10.865 04:26:58 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:26:10.865 04:26:58 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:26:10.865 04:26:58 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=3970289 00:26:10.865 04:26:58 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:26:10.865 04:26:58 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:10.865 04:26:58 compress_isal -- compress/compress.sh@73 -- # waitforlisten 3970289 00:26:10.865 04:26:58 compress_isal -- common/autotest_common.sh@827 -- # 
'[' -z 3970289 ']' 00:26:10.865 04:26:58 compress_isal -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:10.865 04:26:58 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:10.865 04:26:58 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:10.865 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:10.865 04:26:58 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:10.865 04:26:58 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:26:10.865 [2024-05-15 04:26:58.194298] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:26:10.865 [2024-05-15 04:26:58.194383] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3970289 ] 00:26:10.866 [2024-05-15 04:26:58.276446] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:26:10.866 [2024-05-15 04:26:58.394368] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:10.866 [2024-05-15 04:26:58.394372] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:11.429 04:26:59 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:11.429 04:26:59 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:26:11.429 04:26:59 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:26:11.429 04:26:59 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:11.429 04:26:59 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:14.704 04:27:02 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:14.704 04:27:02 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:26:14.704 04:27:02 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:14.704 04:27:02 compress_isal -- common/autotest_common.sh@897 -- # local i 00:26:14.704 04:27:02 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:14.704 04:27:02 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:14.704 04:27:02 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:14.704 04:27:02 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:14.961 [ 00:26:14.961 { 00:26:14.961 "name": "Nvme0n1", 00:26:14.961 "aliases": [ 00:26:14.961 "a947ec57-9295-4443-8e04-e1b3de1368d9" 00:26:14.961 ], 00:26:14.961 "product_name": "NVMe disk", 00:26:14.961 "block_size": 512, 00:26:14.961 "num_blocks": 3907029168, 00:26:14.961 "uuid": "a947ec57-9295-4443-8e04-e1b3de1368d9", 00:26:14.961 "assigned_rate_limits": { 00:26:14.961 "rw_ios_per_sec": 0, 00:26:14.961 "rw_mbytes_per_sec": 0, 00:26:14.961 "r_mbytes_per_sec": 0, 00:26:14.961 "w_mbytes_per_sec": 0 00:26:14.961 }, 00:26:14.961 "claimed": false, 00:26:14.961 "zoned": false, 00:26:14.961 "supported_io_types": { 00:26:14.961 "read": true, 00:26:14.961 "write": true, 00:26:14.961 "unmap": true, 00:26:14.961 "write_zeroes": true, 00:26:14.961 "flush": true, 
00:26:14.961 "reset": true, 00:26:14.961 "compare": false, 00:26:14.961 "compare_and_write": false, 00:26:14.961 "abort": true, 00:26:14.961 "nvme_admin": true, 00:26:14.961 "nvme_io": true 00:26:14.961 }, 00:26:14.961 "driver_specific": { 00:26:14.961 "nvme": [ 00:26:14.961 { 00:26:14.961 "pci_address": "0000:81:00.0", 00:26:14.961 "trid": { 00:26:14.961 "trtype": "PCIe", 00:26:14.961 "traddr": "0000:81:00.0" 00:26:14.961 }, 00:26:14.961 "ctrlr_data": { 00:26:14.961 "cntlid": 0, 00:26:14.961 "vendor_id": "0x8086", 00:26:14.961 "model_number": "INTEL SSDPE2KX020T8", 00:26:14.961 "serial_number": "PHLJ951302VM2P0BGN", 00:26:14.961 "firmware_revision": "VDV10184", 00:26:14.961 "oacs": { 00:26:14.961 "security": 0, 00:26:14.961 "format": 1, 00:26:14.961 "firmware": 1, 00:26:14.961 "ns_manage": 1 00:26:14.961 }, 00:26:14.961 "multi_ctrlr": false, 00:26:14.961 "ana_reporting": false 00:26:14.961 }, 00:26:14.961 "vs": { 00:26:14.961 "nvme_version": "1.2" 00:26:14.961 }, 00:26:14.961 "ns_data": { 00:26:14.961 "id": 1, 00:26:14.961 "can_share": false 00:26:14.961 } 00:26:14.961 } 00:26:14.961 ], 00:26:14.961 "mp_policy": "active_passive" 00:26:14.961 } 00:26:14.961 } 00:26:14.961 ] 00:26:14.961 04:27:02 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:26:14.962 04:27:02 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:16.329 57b6dfe4-1b76-459e-9a0e-69aa6440bfad 00:26:16.329 04:27:04 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:16.329 17cdb8fa-bee4-4e30-812c-96992c0a0cbf 00:26:16.329 04:27:04 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:16.329 04:27:04 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:26:16.329 04:27:04 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:16.329 04:27:04 compress_isal -- common/autotest_common.sh@897 -- # local i 00:26:16.329 04:27:04 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:16.329 04:27:04 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:16.329 04:27:04 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:16.586 04:27:04 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:16.842 [ 00:26:16.842 { 00:26:16.842 "name": "17cdb8fa-bee4-4e30-812c-96992c0a0cbf", 00:26:16.842 "aliases": [ 00:26:16.842 "lvs0/lv0" 00:26:16.842 ], 00:26:16.842 "product_name": "Logical Volume", 00:26:16.842 "block_size": 512, 00:26:16.842 "num_blocks": 204800, 00:26:16.842 "uuid": "17cdb8fa-bee4-4e30-812c-96992c0a0cbf", 00:26:16.842 "assigned_rate_limits": { 00:26:16.842 "rw_ios_per_sec": 0, 00:26:16.842 "rw_mbytes_per_sec": 0, 00:26:16.842 "r_mbytes_per_sec": 0, 00:26:16.842 "w_mbytes_per_sec": 0 00:26:16.842 }, 00:26:16.843 "claimed": false, 00:26:16.843 "zoned": false, 00:26:16.843 "supported_io_types": { 00:26:16.843 "read": true, 00:26:16.843 "write": true, 00:26:16.843 "unmap": true, 00:26:16.843 "write_zeroes": true, 00:26:16.843 "flush": false, 00:26:16.843 "reset": true, 00:26:16.843 "compare": false, 00:26:16.843 "compare_and_write": false, 00:26:16.843 "abort": false, 00:26:16.843 "nvme_admin": false, 00:26:16.843 
"nvme_io": false 00:26:16.843 }, 00:26:16.843 "driver_specific": { 00:26:16.843 "lvol": { 00:26:16.843 "lvol_store_uuid": "57b6dfe4-1b76-459e-9a0e-69aa6440bfad", 00:26:16.843 "base_bdev": "Nvme0n1", 00:26:16.843 "thin_provision": true, 00:26:16.843 "num_allocated_clusters": 0, 00:26:16.843 "snapshot": false, 00:26:16.843 "clone": false, 00:26:16.843 "esnap_clone": false 00:26:16.843 } 00:26:16.843 } 00:26:16.843 } 00:26:16.843 ] 00:26:16.843 04:27:04 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:26:16.843 04:27:04 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:26:16.843 04:27:04 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:26:17.099 [2024-05-15 04:27:04.957474] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:17.099 COMP_lvs0/lv0 00:26:17.099 04:27:04 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:17.099 04:27:04 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:26:17.100 04:27:04 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:17.100 04:27:04 compress_isal -- common/autotest_common.sh@897 -- # local i 00:26:17.100 04:27:04 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:17.100 04:27:04 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:17.100 04:27:04 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:17.356 04:27:05 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:17.612 [ 00:26:17.612 { 00:26:17.612 "name": "COMP_lvs0/lv0", 00:26:17.612 "aliases": [ 00:26:17.612 "42d5a550-d72e-5d45-8f37-bdbf5dea380a" 00:26:17.612 ], 00:26:17.612 "product_name": "compress", 00:26:17.612 "block_size": 4096, 00:26:17.612 "num_blocks": 25088, 00:26:17.612 "uuid": "42d5a550-d72e-5d45-8f37-bdbf5dea380a", 00:26:17.612 "assigned_rate_limits": { 00:26:17.612 "rw_ios_per_sec": 0, 00:26:17.612 "rw_mbytes_per_sec": 0, 00:26:17.613 "r_mbytes_per_sec": 0, 00:26:17.613 "w_mbytes_per_sec": 0 00:26:17.613 }, 00:26:17.613 "claimed": false, 00:26:17.613 "zoned": false, 00:26:17.613 "supported_io_types": { 00:26:17.613 "read": true, 00:26:17.613 "write": true, 00:26:17.613 "unmap": false, 00:26:17.613 "write_zeroes": true, 00:26:17.613 "flush": false, 00:26:17.613 "reset": false, 00:26:17.613 "compare": false, 00:26:17.613 "compare_and_write": false, 00:26:17.613 "abort": false, 00:26:17.613 "nvme_admin": false, 00:26:17.613 "nvme_io": false 00:26:17.613 }, 00:26:17.613 "driver_specific": { 00:26:17.613 "compress": { 00:26:17.613 "name": "COMP_lvs0/lv0", 00:26:17.613 "base_bdev_name": "17cdb8fa-bee4-4e30-812c-96992c0a0cbf" 00:26:17.613 } 00:26:17.613 } 00:26:17.613 } 00:26:17.613 ] 00:26:17.613 04:27:05 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:26:17.613 04:27:05 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:26:17.613 Running I/O for 3 seconds... 
00:26:20.887 00:26:20.887 Latency(us) 00:26:20.887 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:20.887 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:26:20.887 Verification LBA range: start 0x0 length 0x3100 00:26:20.887 COMP_lvs0/lv0 : 3.01 2604.23 10.17 0.00 0.00 12235.47 89.51 19223.89 00:26:20.887 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:26:20.887 Verification LBA range: start 0x3100 length 0x3100 00:26:20.887 COMP_lvs0/lv0 : 3.01 2646.76 10.34 0.00 0.00 12040.78 89.88 20486.07 00:26:20.887 =================================================================================================================== 00:26:20.887 Total : 5250.99 20.51 0.00 0.00 12137.36 89.51 20486.07 00:26:20.887 0 00:26:20.887 04:27:08 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:26:20.887 04:27:08 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:20.887 04:27:08 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:21.144 04:27:09 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:26:21.144 04:27:09 compress_isal -- compress/compress.sh@78 -- # killprocess 3970289 00:26:21.144 04:27:09 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 3970289 ']' 00:26:21.144 04:27:09 compress_isal -- common/autotest_common.sh@950 -- # kill -0 3970289 00:26:21.144 04:27:09 compress_isal -- common/autotest_common.sh@951 -- # uname 00:26:21.144 04:27:09 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:21.144 04:27:09 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3970289 00:26:21.144 04:27:09 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:26:21.144 04:27:09 compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:26:21.144 04:27:09 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3970289' 00:26:21.144 killing process with pid 3970289 00:26:21.144 04:27:09 compress_isal -- common/autotest_common.sh@965 -- # kill 3970289 00:26:21.144 Received shutdown signal, test time was about 3.000000 seconds 00:26:21.144 00:26:21.144 Latency(us) 00:26:21.144 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:21.144 =================================================================================================================== 00:26:21.144 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:21.144 04:27:09 compress_isal -- common/autotest_common.sh@970 -- # wait 3970289 00:26:23.672 04:27:11 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:26:23.672 04:27:11 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:26:23.672 04:27:11 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=3972510 00:26:23.672 04:27:11 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:26:23.672 04:27:11 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:26:23.672 04:27:11 compress_isal -- compress/compress.sh@57 -- # waitforlisten 3972510 00:26:23.672 04:27:11 compress_isal -- common/autotest_common.sh@827 -- # '[' -z 3972510 ']' 00:26:23.672 04:27:11 compress_isal -- 
common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:23.672 04:27:11 compress_isal -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:23.672 04:27:11 compress_isal -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:23.672 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:23.672 04:27:11 compress_isal -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:23.672 04:27:11 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:26:23.672 [2024-05-15 04:27:11.597200] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:26:23.672 [2024-05-15 04:27:11.597281] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3972510 ] 00:26:23.672 [2024-05-15 04:27:11.679906] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:23.931 [2024-05-15 04:27:11.796592] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:23.931 [2024-05-15 04:27:11.796655] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:23.931 [2024-05-15 04:27:11.796658] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:24.863 04:27:12 compress_isal -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:24.863 04:27:12 compress_isal -- common/autotest_common.sh@860 -- # return 0 00:26:24.863 04:27:12 compress_isal -- compress/compress.sh@58 -- # create_vols 00:26:24.863 04:27:12 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:26:24.863 04:27:12 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:26:28.140 04:27:15 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:26:28.140 04:27:15 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=Nvme0n1 00:26:28.140 04:27:15 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:28.140 04:27:15 compress_isal -- common/autotest_common.sh@897 -- # local i 00:26:28.140 04:27:15 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:28.140 04:27:15 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:28.140 04:27:15 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:28.140 04:27:15 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:26:28.140 [ 00:26:28.140 { 00:26:28.140 "name": "Nvme0n1", 00:26:28.140 "aliases": [ 00:26:28.140 "c6405e6d-2739-4bb2-907f-c23372a8249b" 00:26:28.140 ], 00:26:28.140 "product_name": "NVMe disk", 00:26:28.140 "block_size": 512, 00:26:28.140 "num_blocks": 3907029168, 00:26:28.140 "uuid": "c6405e6d-2739-4bb2-907f-c23372a8249b", 00:26:28.140 "assigned_rate_limits": { 00:26:28.140 "rw_ios_per_sec": 0, 00:26:28.140 "rw_mbytes_per_sec": 0, 00:26:28.140 "r_mbytes_per_sec": 0, 00:26:28.140 "w_mbytes_per_sec": 0 00:26:28.140 }, 00:26:28.140 "claimed": false, 00:26:28.140 "zoned": false, 00:26:28.140 "supported_io_types": { 00:26:28.140 "read": true, 00:26:28.140 "write": true, 00:26:28.140 "unmap": true, 00:26:28.140 
"write_zeroes": true, 00:26:28.140 "flush": true, 00:26:28.140 "reset": true, 00:26:28.140 "compare": false, 00:26:28.140 "compare_and_write": false, 00:26:28.140 "abort": true, 00:26:28.140 "nvme_admin": true, 00:26:28.140 "nvme_io": true 00:26:28.140 }, 00:26:28.140 "driver_specific": { 00:26:28.140 "nvme": [ 00:26:28.140 { 00:26:28.140 "pci_address": "0000:81:00.0", 00:26:28.140 "trid": { 00:26:28.140 "trtype": "PCIe", 00:26:28.140 "traddr": "0000:81:00.0" 00:26:28.140 }, 00:26:28.140 "ctrlr_data": { 00:26:28.140 "cntlid": 0, 00:26:28.140 "vendor_id": "0x8086", 00:26:28.140 "model_number": "INTEL SSDPE2KX020T8", 00:26:28.140 "serial_number": "PHLJ951302VM2P0BGN", 00:26:28.140 "firmware_revision": "VDV10184", 00:26:28.140 "oacs": { 00:26:28.140 "security": 0, 00:26:28.140 "format": 1, 00:26:28.140 "firmware": 1, 00:26:28.140 "ns_manage": 1 00:26:28.140 }, 00:26:28.140 "multi_ctrlr": false, 00:26:28.140 "ana_reporting": false 00:26:28.140 }, 00:26:28.140 "vs": { 00:26:28.140 "nvme_version": "1.2" 00:26:28.140 }, 00:26:28.140 "ns_data": { 00:26:28.140 "id": 1, 00:26:28.140 "can_share": false 00:26:28.140 } 00:26:28.140 } 00:26:28.140 ], 00:26:28.140 "mp_policy": "active_passive" 00:26:28.140 } 00:26:28.140 } 00:26:28.140 ] 00:26:28.140 04:27:16 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:26:28.140 04:27:16 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:26:29.512 23f90373-408a-4d2c-abd6-fc104932463d 00:26:29.512 04:27:17 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:26:29.769 a17afd1d-5c1e-4a65-b7d8-39d9f9531eb2 00:26:29.769 04:27:17 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:26:29.769 04:27:17 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=lvs0/lv0 00:26:29.769 04:27:17 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:29.769 04:27:17 compress_isal -- common/autotest_common.sh@897 -- # local i 00:26:29.769 04:27:17 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:29.769 04:27:17 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:29.769 04:27:17 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:30.026 04:27:17 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:26:30.283 [ 00:26:30.283 { 00:26:30.283 "name": "a17afd1d-5c1e-4a65-b7d8-39d9f9531eb2", 00:26:30.283 "aliases": [ 00:26:30.283 "lvs0/lv0" 00:26:30.283 ], 00:26:30.283 "product_name": "Logical Volume", 00:26:30.283 "block_size": 512, 00:26:30.283 "num_blocks": 204800, 00:26:30.283 "uuid": "a17afd1d-5c1e-4a65-b7d8-39d9f9531eb2", 00:26:30.283 "assigned_rate_limits": { 00:26:30.283 "rw_ios_per_sec": 0, 00:26:30.283 "rw_mbytes_per_sec": 0, 00:26:30.283 "r_mbytes_per_sec": 0, 00:26:30.283 "w_mbytes_per_sec": 0 00:26:30.283 }, 00:26:30.283 "claimed": false, 00:26:30.283 "zoned": false, 00:26:30.283 "supported_io_types": { 00:26:30.283 "read": true, 00:26:30.283 "write": true, 00:26:30.283 "unmap": true, 00:26:30.283 "write_zeroes": true, 00:26:30.283 "flush": false, 00:26:30.283 "reset": true, 00:26:30.283 "compare": false, 00:26:30.283 "compare_and_write": false, 00:26:30.283 "abort": 
false, 00:26:30.283 "nvme_admin": false, 00:26:30.283 "nvme_io": false 00:26:30.283 }, 00:26:30.283 "driver_specific": { 00:26:30.283 "lvol": { 00:26:30.283 "lvol_store_uuid": "23f90373-408a-4d2c-abd6-fc104932463d", 00:26:30.283 "base_bdev": "Nvme0n1", 00:26:30.283 "thin_provision": true, 00:26:30.283 "num_allocated_clusters": 0, 00:26:30.283 "snapshot": false, 00:26:30.283 "clone": false, 00:26:30.283 "esnap_clone": false 00:26:30.283 } 00:26:30.283 } 00:26:30.283 } 00:26:30.283 ] 00:26:30.283 04:27:18 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:26:30.283 04:27:18 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:26:30.283 04:27:18 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:26:30.541 [2024-05-15 04:27:18.368096] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:26:30.541 COMP_lvs0/lv0 00:26:30.541 04:27:18 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:26:30.541 04:27:18 compress_isal -- common/autotest_common.sh@895 -- # local bdev_name=COMP_lvs0/lv0 00:26:30.541 04:27:18 compress_isal -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:26:30.541 04:27:18 compress_isal -- common/autotest_common.sh@897 -- # local i 00:26:30.541 04:27:18 compress_isal -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:26:30.541 04:27:18 compress_isal -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:26:30.541 04:27:18 compress_isal -- common/autotest_common.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:26:30.798 04:27:18 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:26:31.056 [ 00:26:31.056 { 00:26:31.056 "name": "COMP_lvs0/lv0", 00:26:31.056 "aliases": [ 00:26:31.056 "2a8d088f-1f6c-5c39-bf02-3073c55cbf08" 00:26:31.056 ], 00:26:31.056 "product_name": "compress", 00:26:31.056 "block_size": 512, 00:26:31.056 "num_blocks": 200704, 00:26:31.056 "uuid": "2a8d088f-1f6c-5c39-bf02-3073c55cbf08", 00:26:31.056 "assigned_rate_limits": { 00:26:31.056 "rw_ios_per_sec": 0, 00:26:31.056 "rw_mbytes_per_sec": 0, 00:26:31.056 "r_mbytes_per_sec": 0, 00:26:31.056 "w_mbytes_per_sec": 0 00:26:31.056 }, 00:26:31.056 "claimed": false, 00:26:31.056 "zoned": false, 00:26:31.056 "supported_io_types": { 00:26:31.056 "read": true, 00:26:31.056 "write": true, 00:26:31.056 "unmap": false, 00:26:31.056 "write_zeroes": true, 00:26:31.056 "flush": false, 00:26:31.056 "reset": false, 00:26:31.056 "compare": false, 00:26:31.056 "compare_and_write": false, 00:26:31.056 "abort": false, 00:26:31.056 "nvme_admin": false, 00:26:31.056 "nvme_io": false 00:26:31.056 }, 00:26:31.056 "driver_specific": { 00:26:31.056 "compress": { 00:26:31.056 "name": "COMP_lvs0/lv0", 00:26:31.056 "base_bdev_name": "a17afd1d-5c1e-4a65-b7d8-39d9f9531eb2" 00:26:31.056 } 00:26:31.056 } 00:26:31.056 } 00:26:31.056 ] 00:26:31.056 04:27:18 compress_isal -- common/autotest_common.sh@903 -- # return 0 00:26:31.056 04:27:18 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:31.056 I/O targets: 00:26:31.056 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:26:31.056 00:26:31.056 00:26:31.056 CUnit - A unit testing framework for C - Version 2.1-3 00:26:31.056 
http://cunit.sourceforge.net/ 00:26:31.056 00:26:31.056 00:26:31.056 Suite: bdevio tests on: COMP_lvs0/lv0 00:26:31.056 Test: blockdev write read block ...passed 00:26:31.056 Test: blockdev write zeroes read block ...passed 00:26:31.056 Test: blockdev write zeroes read no split ...passed 00:26:31.056 Test: blockdev write zeroes read split ...passed 00:26:31.056 Test: blockdev write zeroes read split partial ...passed 00:26:31.056 Test: blockdev reset ...[2024-05-15 04:27:19.020272] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:26:31.056 passed 00:26:31.056 Test: blockdev write read 8 blocks ...passed 00:26:31.056 Test: blockdev write read size > 128k ...passed 00:26:31.056 Test: blockdev write read invalid size ...passed 00:26:31.056 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:31.056 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:31.056 Test: blockdev write read max offset ...passed 00:26:31.056 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:31.056 Test: blockdev writev readv 8 blocks ...passed 00:26:31.056 Test: blockdev writev readv 30 x 1block ...passed 00:26:31.056 Test: blockdev writev readv block ...passed 00:26:31.056 Test: blockdev writev readv size > 128k ...passed 00:26:31.056 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:31.056 Test: blockdev comparev and writev ...passed 00:26:31.056 Test: blockdev nvme passthru rw ...passed 00:26:31.056 Test: blockdev nvme passthru vendor specific ...passed 00:26:31.056 Test: blockdev nvme admin passthru ...passed 00:26:31.056 Test: blockdev copy ...passed 00:26:31.056 00:26:31.056 Run Summary: Type Total Ran Passed Failed Inactive 00:26:31.056 suites 1 1 n/a 0 0 00:26:31.056 tests 23 23 23 0 0 00:26:31.056 asserts 130 130 130 0 n/a 00:26:31.056 00:26:31.056 Elapsed time = 0.226 seconds 00:26:31.056 0 00:26:31.313 04:27:19 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:26:31.313 04:27:19 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:26:31.314 04:27:19 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:26:31.571 04:27:19 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:26:31.571 04:27:19 compress_isal -- compress/compress.sh@62 -- # killprocess 3972510 00:26:31.571 04:27:19 compress_isal -- common/autotest_common.sh@946 -- # '[' -z 3972510 ']' 00:26:31.571 04:27:19 compress_isal -- common/autotest_common.sh@950 -- # kill -0 3972510 00:26:31.571 04:27:19 compress_isal -- common/autotest_common.sh@951 -- # uname 00:26:31.571 04:27:19 compress_isal -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:31.571 04:27:19 compress_isal -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3972510 00:26:31.571 04:27:19 compress_isal -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:31.571 04:27:19 compress_isal -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:31.571 04:27:19 compress_isal -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3972510' 00:26:31.571 killing process with pid 3972510 00:26:31.828 04:27:19 compress_isal -- common/autotest_common.sh@965 -- # kill 3972510 00:26:31.828 04:27:19 compress_isal -- common/autotest_common.sh@970 -- # wait 3972510 00:26:34.351 04:27:21 
compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:26:34.351 04:27:21 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:26:34.351 00:26:34.351 real 0m51.006s 00:26:34.351 user 1m57.721s 00:26:34.351 sys 0m3.481s 00:26:34.351 04:27:21 compress_isal -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:34.351 04:27:21 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:26:34.351 ************************************ 00:26:34.351 END TEST compress_isal 00:26:34.351 ************************************ 00:26:34.351 04:27:21 -- spdk/autotest.sh@348 -- # '[' 0 -eq 1 ']' 00:26:34.351 04:27:21 -- spdk/autotest.sh@352 -- # '[' 1 -eq 1 ']' 00:26:34.351 04:27:21 -- spdk/autotest.sh@353 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:26:34.351 04:27:21 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:26:34.351 04:27:21 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:34.351 04:27:21 -- common/autotest_common.sh@10 -- # set +x 00:26:34.351 ************************************ 00:26:34.351 START TEST blockdev_crypto_aesni 00:26:34.351 ************************************ 00:26:34.351 04:27:22 blockdev_crypto_aesni -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:26:34.351 * Looking for test storage... 00:26:34.351 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:26:34.351 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 
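For reference, the destroy_vols step traced at the end of both compress runs above is just the reverse pair of calls, compress vbdev first, then the lvstore underneath it (rpc.py path shortened):

  rpc.py bdev_compress_delete COMP_lvs0/lv0
  rpc.py bdev_lvol_delete_lvstore -l lvs0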
00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3973780 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:26:34.352 04:27:22 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 3973780 00:26:34.352 04:27:22 blockdev_crypto_aesni -- common/autotest_common.sh@827 -- # '[' -z 3973780 ']' 00:26:34.352 04:27:22 blockdev_crypto_aesni -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:34.352 04:27:22 blockdev_crypto_aesni -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:34.352 04:27:22 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:34.352 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:34.352 04:27:22 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:34.352 04:27:22 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:34.352 [2024-05-15 04:27:22.118877] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:26:34.352 [2024-05-15 04:27:22.118955] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3973780 ] 00:26:34.352 [2024-05-15 04:27:22.195226] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:34.352 [2024-05-15 04:27:22.304556] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:35.298 04:27:23 blockdev_crypto_aesni -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:35.298 04:27:23 blockdev_crypto_aesni -- common/autotest_common.sh@860 -- # return 0 00:26:35.298 04:27:23 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:26:35.299 04:27:23 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:26:35.299 04:27:23 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:26:35.299 04:27:23 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:35.299 04:27:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:35.299 [2024-05-15 04:27:23.046926] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:35.299 [2024-05-15 04:27:23.054954] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:35.299 [2024-05-15 04:27:23.062983] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:35.299 [2024-05-15 04:27:23.143487] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:37.832 true 00:26:37.832 true 00:26:37.832 true 00:26:37.832 true 00:26:37.832 Malloc0 00:26:37.832 Malloc1 00:26:37.832 Malloc2 00:26:37.832 Malloc3 00:26:37.832 [2024-05-15 04:27:25.741670] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:37.832 crypto_ram 00:26:37.832 [2024-05-15 04:27:25.749684] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:37.832 crypto_ram2 00:26:37.832 [2024-05-15 04:27:25.757701] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:37.832 crypto_ram3 00:26:37.832 [2024-05-15 04:27:25.765721] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:37.832 crypto_ram4 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:37.832 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:37.832 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:26:37.832 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:37.832 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 
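The rpc_cmd batch behind setup_crypto_aesni_conf is not echoed verbatim, but the notices above pin down what it does: select the crypto_aesni_mb DPDK driver, route encrypt/decrypt to dpdk_cryptodev, finish framework init (spdk_tgt was started with --wait-for-rpc), create four malloc base bdevs, register the four DEKs, and wrap each malloc bdev in a crypto vbdev. A rough rpc.py equivalent is sketched below; the exact RPC spellings and flags are an assumption for this SPDK vintage (check scripts/rpc.py on the tested revision), and the hex key is a placeholder, not the real test DEK:

  rpc.py dpdk_cryptodev_set_driver crypto_aesni_mb          # driver named in the NOTICE above
  rpc.py accel_assign_opc -o encrypt -m dpdk_cryptodev
  rpc.py accel_assign_opc -o decrypt -m dpdk_cryptodev
  rpc.py framework_start_init
  rpc.py bdev_malloc_create -b Malloc0 32 512               # 32 MiB at 512 B blocks = the 65536-block bdev dumped below
  rpc.py accel_crypto_key_create -c AES_CBC -k <hex-DEK> -n test_dek_aesni_cbc_1   # placeholder key material
  rpc.py bdev_crypto_create -n test_dek_aesni_cbc_1 Malloc0 crypto_ram
  # ...repeated for Malloc1-3 / test_dek_aesni_cbc_2-4 / crypto_ram2-4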
00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:37.832 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:37.832 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:26:37.832 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:26:37.832 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:26:37.832 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:38.089 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:26:38.089 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:26:38.089 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:26:38.090 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "faa734cf-c27f-52bf-97df-9cafcce04e6f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "faa734cf-c27f-52bf-97df-9cafcce04e6f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "f6a57a23-cf0e-57d9-a6dd-de74e7e13f20"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f6a57a23-cf0e-57d9-a6dd-de74e7e13f20",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "d54eb04f-b9b4-57a2-94e7-d55fafd2adf1"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d54eb04f-b9b4-57a2-94e7-d55fafd2adf1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "176c5e10-f83b-5181-a786-0adc7fb48ac7"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "176c5e10-f83b-5181-a786-0adc7fb48ac7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:26:38.090 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:26:38.090 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:26:38.090 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:26:38.090 04:27:25 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 3973780 00:26:38.090 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@946 -- # '[' -z 3973780 ']' 00:26:38.090 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@950 -- # kill -0 3973780 00:26:38.090 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@951 -- # uname 00:26:38.090 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:38.090 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3973780 00:26:38.090 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:38.090 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:38.090 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3973780' 00:26:38.090 killing process with pid 3973780 00:26:38.090 04:27:25 blockdev_crypto_aesni -- 
common/autotest_common.sh@965 -- # kill 3973780 00:26:38.090 04:27:25 blockdev_crypto_aesni -- common/autotest_common.sh@970 -- # wait 3973780 00:26:38.654 04:27:26 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:26:38.654 04:27:26 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:38.654 04:27:26 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:26:38.654 04:27:26 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:38.654 04:27:26 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:38.654 ************************************ 00:26:38.654 START TEST bdev_hello_world 00:26:38.654 ************************************ 00:26:38.654 04:27:26 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:26:38.654 [2024-05-15 04:27:26.596290] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:26:38.654 [2024-05-15 04:27:26.596362] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3974326 ] 00:26:38.912 [2024-05-15 04:27:26.677596] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:38.912 [2024-05-15 04:27:26.800247] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:38.912 [2024-05-15 04:27:26.821522] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:38.912 [2024-05-15 04:27:26.829548] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:38.912 [2024-05-15 04:27:26.837600] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:39.170 [2024-05-15 04:27:26.955277] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:41.696 [2024-05-15 04:27:29.357057] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:41.696 [2024-05-15 04:27:29.357151] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:41.696 [2024-05-15 04:27:29.357181] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:41.696 [2024-05-15 04:27:29.365071] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:41.696 [2024-05-15 04:27:29.365101] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:41.696 [2024-05-15 04:27:29.365116] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:41.696 [2024-05-15 04:27:29.373091] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:41.696 [2024-05-15 04:27:29.373120] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:41.696 [2024-05-15 04:27:29.373135] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev 
creation deferred pending base bdev arrival 00:26:41.696 [2024-05-15 04:27:29.381111] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:41.696 [2024-05-15 04:27:29.381139] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:41.696 [2024-05-15 04:27:29.381154] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:41.696 [2024-05-15 04:27:29.465688] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:26:41.696 [2024-05-15 04:27:29.465744] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:26:41.696 [2024-05-15 04:27:29.465766] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:26:41.696 [2024-05-15 04:27:29.467018] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:26:41.696 [2024-05-15 04:27:29.467106] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:26:41.696 [2024-05-15 04:27:29.467130] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:26:41.697 [2024-05-15 04:27:29.467194] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:26:41.697 00:26:41.697 [2024-05-15 04:27:29.467222] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:26:41.954 00:26:41.954 real 0m3.362s 00:26:41.954 user 0m2.821s 00:26:41.954 sys 0m0.503s 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:26:41.954 ************************************ 00:26:41.954 END TEST bdev_hello_world 00:26:41.954 ************************************ 00:26:41.954 04:27:29 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:26:41.954 04:27:29 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:26:41.954 04:27:29 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:41.954 04:27:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:41.954 ************************************ 00:26:41.954 START TEST bdev_bounds 00:26:41.954 ************************************ 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3974742 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3974742' 00:26:41.954 Process bdevio pid: 3974742 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3974742 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 3974742 ']' 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 
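Stripped of the Jenkins workspace prefix, the bounds test traced here pairs the bdevio app with its RPC-driven runner, the same pattern used for the compress bdevio pass earlier (-w appears to defer the test run until tests.py issues perform_tests, and -s 0 matches PRE_RESERVED_MEM=0 set above):

  test/bdev/bdevio/bdevio -w -s 0 --json test/bdev/bdev.json '' &
  test/bdev/bdevio/tests.py perform_tests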
00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:26:41.954 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:41.954 04:27:29 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:42.211 [2024-05-15 04:27:30.007171] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:26:42.211 [2024-05-15 04:27:30.007236] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3974742 ] 00:26:42.211 [2024-05-15 04:27:30.088586] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:26:42.211 [2024-05-15 04:27:30.201381] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:26:42.211 [2024-05-15 04:27:30.201461] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:26:42.211 [2024-05-15 04:27:30.201464] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:42.211 [2024-05-15 04:27:30.222767] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:42.467 [2024-05-15 04:27:30.230791] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:42.467 [2024-05-15 04:27:30.238811] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:42.467 [2024-05-15 04:27:30.347509] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:45.107 [2024-05-15 04:27:32.714306] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:45.107 [2024-05-15 04:27:32.714398] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:45.107 [2024-05-15 04:27:32.714416] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:45.107 [2024-05-15 04:27:32.722322] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:45.107 [2024-05-15 04:27:32.722347] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:45.107 [2024-05-15 04:27:32.722367] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:45.107 [2024-05-15 04:27:32.730345] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:45.107 [2024-05-15 04:27:32.730368] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:45.107 [2024-05-15 04:27:32.730385] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:45.108 [2024-05-15 04:27:32.738366] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:45.108 [2024-05-15 04:27:32.738389] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:45.108 [2024-05-15 04:27:32.738400] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:45.108 04:27:32 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:45.108 04:27:32 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:26:45.108 04:27:32 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:26:45.108 I/O targets: 00:26:45.108 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:26:45.108 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:26:45.108 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:26:45.108 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:26:45.108 00:26:45.108 00:26:45.108 CUnit - A unit testing framework for C - Version 2.1-3 00:26:45.108 http://cunit.sourceforge.net/ 00:26:45.108 00:26:45.108 00:26:45.108 Suite: bdevio tests on: crypto_ram4 00:26:45.108 Test: blockdev write read block ...passed 00:26:45.108 Test: blockdev write zeroes read block ...passed 00:26:45.108 Test: blockdev write zeroes read no split ...passed 00:26:45.108 Test: blockdev write zeroes read split ...passed 00:26:45.108 Test: blockdev write zeroes read split partial ...passed 00:26:45.108 Test: blockdev reset ...passed 00:26:45.108 Test: blockdev write read 8 blocks ...passed 00:26:45.108 Test: blockdev write read size > 128k ...passed 00:26:45.108 Test: blockdev write read invalid size ...passed 00:26:45.108 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:45.108 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:45.108 Test: blockdev write read max offset ...passed 00:26:45.108 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:45.108 Test: blockdev writev readv 8 blocks ...passed 00:26:45.108 Test: blockdev writev readv 30 x 1block ...passed 00:26:45.108 Test: blockdev writev readv block ...passed 00:26:45.108 Test: blockdev writev readv size > 128k ...passed 00:26:45.108 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:45.108 Test: blockdev comparev and writev ...passed 00:26:45.108 Test: blockdev nvme passthru rw ...passed 00:26:45.108 Test: blockdev nvme passthru vendor specific ...passed 00:26:45.108 Test: blockdev nvme admin passthru ...passed 00:26:45.108 Test: blockdev copy ...passed 00:26:45.108 Suite: bdevio tests on: crypto_ram3 00:26:45.108 Test: blockdev write read block ...passed 00:26:45.108 Test: blockdev write zeroes read block ...passed 00:26:45.108 Test: blockdev write zeroes read no split ...passed 00:26:45.108 Test: blockdev write zeroes read split ...passed 00:26:45.108 Test: blockdev write zeroes read split partial ...passed 00:26:45.108 Test: blockdev reset ...passed 00:26:45.108 Test: blockdev write read 8 blocks ...passed 00:26:45.108 Test: blockdev write read size > 128k ...passed 00:26:45.108 Test: blockdev write read invalid size ...passed 00:26:45.108 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:45.108 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:45.108 Test: blockdev write read max offset ...passed 00:26:45.108 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:45.108 Test: blockdev writev readv 8 blocks ...passed 00:26:45.108 Test: blockdev writev readv 30 x 1block ...passed 00:26:45.108 Test: blockdev writev readv block ...passed 00:26:45.108 Test: blockdev writev readv size > 128k ...passed 00:26:45.108 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:45.108 Test: blockdev 
comparev and writev ...passed 00:26:45.108 Test: blockdev nvme passthru rw ...passed 00:26:45.108 Test: blockdev nvme passthru vendor specific ...passed 00:26:45.108 Test: blockdev nvme admin passthru ...passed 00:26:45.108 Test: blockdev copy ...passed 00:26:45.108 Suite: bdevio tests on: crypto_ram2 00:26:45.108 Test: blockdev write read block ...passed 00:26:45.108 Test: blockdev write zeroes read block ...passed 00:26:45.108 Test: blockdev write zeroes read no split ...passed 00:26:45.108 Test: blockdev write zeroes read split ...passed 00:26:45.366 Test: blockdev write zeroes read split partial ...passed 00:26:45.366 Test: blockdev reset ...passed 00:26:45.366 Test: blockdev write read 8 blocks ...passed 00:26:45.366 Test: blockdev write read size > 128k ...passed 00:26:45.366 Test: blockdev write read invalid size ...passed 00:26:45.366 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:45.366 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:45.366 Test: blockdev write read max offset ...passed 00:26:45.366 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:45.366 Test: blockdev writev readv 8 blocks ...passed 00:26:45.366 Test: blockdev writev readv 30 x 1block ...passed 00:26:45.366 Test: blockdev writev readv block ...passed 00:26:45.366 Test: blockdev writev readv size > 128k ...passed 00:26:45.366 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:45.366 Test: blockdev comparev and writev ...passed 00:26:45.366 Test: blockdev nvme passthru rw ...passed 00:26:45.366 Test: blockdev nvme passthru vendor specific ...passed 00:26:45.366 Test: blockdev nvme admin passthru ...passed 00:26:45.366 Test: blockdev copy ...passed 00:26:45.366 Suite: bdevio tests on: crypto_ram 00:26:45.366 Test: blockdev write read block ...passed 00:26:45.366 Test: blockdev write zeroes read block ...passed 00:26:45.366 Test: blockdev write zeroes read no split ...passed 00:26:45.366 Test: blockdev write zeroes read split ...passed 00:26:45.366 Test: blockdev write zeroes read split partial ...passed 00:26:45.366 Test: blockdev reset ...passed 00:26:45.366 Test: blockdev write read 8 blocks ...passed 00:26:45.366 Test: blockdev write read size > 128k ...passed 00:26:45.366 Test: blockdev write read invalid size ...passed 00:26:45.366 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:26:45.366 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:26:45.366 Test: blockdev write read max offset ...passed 00:26:45.366 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:26:45.366 Test: blockdev writev readv 8 blocks ...passed 00:26:45.366 Test: blockdev writev readv 30 x 1block ...passed 00:26:45.366 Test: blockdev writev readv block ...passed 00:26:45.366 Test: blockdev writev readv size > 128k ...passed 00:26:45.366 Test: blockdev writev readv size > 128k in two iovs ...passed 00:26:45.366 Test: blockdev comparev and writev ...passed 00:26:45.366 Test: blockdev nvme passthru rw ...passed 00:26:45.366 Test: blockdev nvme passthru vendor specific ...passed 00:26:45.366 Test: blockdev nvme admin passthru ...passed 00:26:45.366 Test: blockdev copy ...passed 00:26:45.366 00:26:45.366 Run Summary: Type Total Ran Passed Failed Inactive 00:26:45.366 suites 4 4 n/a 0 0 00:26:45.366 tests 92 92 92 0 0 00:26:45.366 asserts 520 520 520 0 n/a 00:26:45.366 00:26:45.366 Elapsed time = 0.636 seconds 00:26:45.366 0 00:26:45.366 04:27:33 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3974742 00:26:45.366 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 3974742 ']' 00:26:45.366 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 3974742 00:26:45.366 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:26:45.366 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:45.366 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3974742 00:26:45.366 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:45.366 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:45.366 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3974742' 00:26:45.366 killing process with pid 3974742 00:26:45.366 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@965 -- # kill 3974742 00:26:45.366 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@970 -- # wait 3974742 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:26:45.931 00:26:45.931 real 0m3.792s 00:26:45.931 user 0m10.560s 00:26:45.931 sys 0m0.637s 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:26:45.931 ************************************ 00:26:45.931 END TEST bdev_bounds 00:26:45.931 ************************************ 00:26:45.931 04:27:33 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:26:45.931 04:27:33 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:26:45.931 04:27:33 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:45.931 04:27:33 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:45.931 ************************************ 00:26:45.931 START TEST bdev_nbd 00:26:45.931 ************************************ 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:26:45.931 04:27:33 
blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3975250 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3975250 /var/tmp/spdk-nbd.sock 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 3975250 ']' 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:26:45.931 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:26:45.931 04:27:33 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:26:45.931 [2024-05-15 04:27:33.857006] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
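The nbd pass that follows exposes each crypto bdev as a kernel block device through the bdev_svc app and verifies it with a direct read. Condensed, with the output path shortened and the nbd device given explicitly even though the trace lets the RPC pick one, the flow is roughly:

  test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json test/bdev/bdev.json &
  rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
  dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct   # the waitfornbd sanity read seen in the trace
  rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0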
00:26:45.931 [2024-05-15 04:27:33.857081] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:45.931 [2024-05-15 04:27:33.931868] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:46.189 [2024-05-15 04:27:34.038338] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:26:46.189 [2024-05-15 04:27:34.059461] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:26:46.189 [2024-05-15 04:27:34.067480] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:26:46.189 [2024-05-15 04:27:34.075500] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:26:46.189 [2024-05-15 04:27:34.183958] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:26:48.713 [2024-05-15 04:27:36.578877] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:26:48.713 [2024-05-15 04:27:36.578976] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:26:48.713 [2024-05-15 04:27:36.578997] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:48.713 [2024-05-15 04:27:36.586895] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:26:48.713 [2024-05-15 04:27:36.586924] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:26:48.713 [2024-05-15 04:27:36.586939] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:48.713 [2024-05-15 04:27:36.594915] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:26:48.713 [2024-05-15 04:27:36.594943] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:26:48.713 [2024-05-15 04:27:36.594958] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:48.713 [2024-05-15 04:27:36.602936] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:26:48.713 [2024-05-15 04:27:36.602963] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:26:48.713 [2024-05-15 04:27:36.602977] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 
'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:26:48.713 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:26:48.714 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:26:48.714 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:48.971 1+0 records in 00:26:48.971 1+0 records out 00:26:48.971 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00102233 s, 4.0 MB/s 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:26:48.971 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:49.229 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:26:49.229 04:27:36 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:26:49.229 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:49.229 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:26:49.229 04:27:36 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:26:49.488 04:27:37 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:49.488 1+0 records in 00:26:49.488 1+0 records out 00:26:49.488 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242332 s, 16.9 MB/s 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:26:49.488 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 
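Each nbd_start_disk RPC in this stretch of the trace is immediately followed by a waitfornbd poll on the new device before the loop moves on to the next bdev. Reconstructed from the repeating pattern in the xtrace (the retry bound of 20 comes from the (( i <= 20 )) guards; the pause between polls and the temp-file path are assumptions), the helper is roughly:

    waitfornbd() {
        local nbd_name=$1 i size
        for ((i = 1; i <= 20; i++)); do            # wait for the kernel to list the device
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1                              # pause between polls (assumed)
        done
        for ((i = 1; i <= 20; i++)); do            # then prove it answers a direct 4 KiB read
            if dd if=/dev/$nbd_name of=/tmp/nbdtest bs=4096 count=1 iflag=direct; then
                size=$(stat -c %s /tmp/nbdtest)
                rm -f /tmp/nbdtest
                [[ $size != 0 ]] && return 0
            fi
            sleep 0.1                              # assumed
        done
        return 1
    }

The "1+0 records in/out" and "4096 bytes ... copied" lines that recur through this part of the trace are the output of exactly that dd probe.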
00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:49.746 1+0 records in 00:26:49.746 1+0 records out 00:26:49.746 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000200467 s, 20.4 MB/s 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:26:49.746 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:50.004 1+0 records in 00:26:50.004 1+0 records out 00:26:50.004 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289414 s, 14.2 MB/s 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:26:50.004 04:27:37 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:26:50.004 04:27:37 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:50.261 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:26:50.261 { 00:26:50.261 "nbd_device": "/dev/nbd0", 00:26:50.261 "bdev_name": "crypto_ram" 00:26:50.261 }, 00:26:50.261 { 00:26:50.261 "nbd_device": "/dev/nbd1", 00:26:50.261 "bdev_name": "crypto_ram2" 00:26:50.261 }, 00:26:50.261 { 00:26:50.261 "nbd_device": "/dev/nbd2", 00:26:50.261 "bdev_name": "crypto_ram3" 00:26:50.261 }, 00:26:50.261 { 00:26:50.261 "nbd_device": "/dev/nbd3", 00:26:50.261 "bdev_name": "crypto_ram4" 00:26:50.261 } 00:26:50.261 ]' 00:26:50.261 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:26:50.261 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:26:50.261 { 00:26:50.261 "nbd_device": "/dev/nbd0", 00:26:50.261 "bdev_name": "crypto_ram" 00:26:50.261 }, 00:26:50.261 { 00:26:50.261 "nbd_device": "/dev/nbd1", 00:26:50.261 "bdev_name": "crypto_ram2" 00:26:50.261 }, 00:26:50.261 { 00:26:50.261 "nbd_device": "/dev/nbd2", 00:26:50.261 "bdev_name": "crypto_ram3" 00:26:50.262 }, 00:26:50.262 { 00:26:50.262 "nbd_device": "/dev/nbd3", 00:26:50.262 "bdev_name": "crypto_ram4" 00:26:50.262 } 00:26:50.262 ]' 00:26:50.262 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:26:50.262 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:26:50.262 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:50.262 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:26:50.262 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:50.262 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:50.262 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:50.262 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:50.519 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:50.519 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:50.519 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:50.519 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:50.519 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:50.519 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:50.519 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:50.519 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
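nbd_rpc_start_stop_verify then tears everything down again: nbd_get_disks returns the JSON array shown above mapping each /dev/nbdX node to its bdev name, the test extracts the device paths with jq, and each one is detached with nbd_stop_disk followed by waitfornbd_exit, which waits for the nbdX entry to disappear from /proc/partitions. A compact sketch of that step, with the rpc.py path shortened and the jq filter copied from the trace:

    rpc="$SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
    disks_json=$($rpc nbd_get_disks)     # [{"nbd_device": "/dev/nbd0", "bdev_name": "crypto_ram"}, ...]
    for dev in $(echo "$disks_json" | jq -r '.[] | .nbd_device'); do
        $rpc nbd_stop_disk "$dev"
        waitfornbd_exit "$(basename "$dev")"     # poll until the nbdX entry leaves /proc/partitions
    done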
00:26:50.519 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:50.519 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:26:50.776 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:50.776 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:50.776 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:50.776 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:50.776 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:50.776 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:50.776 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:50.776 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:50.776 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:50.776 04:27:38 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:26:51.035 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:26:51.035 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:26:51.035 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:26:51.035 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:51.035 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:51.035 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:26:51.035 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:51.035 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:51.035 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:51.035 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
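The nbd_get_count call that the trace enters here (and completes just below) is the pass/fail gate for this half of the test: it re-queries nbd_get_disks and counts how many /dev/nbd entries remain, which must be 0 after the stop loop and 4 once the devices are re-attached for the data-verify pass. A sketch of that check, using the same echo | jq | grep -c pipeline visible in the trace (the trailing true matches the bare "true" step logged when the count is zero):

    nbd_get_count() {
        local rpc_server=$1 disks_json disks_name
        disks_json=$($SPDK_DIR/scripts/rpc.py -s "$rpc_server" nbd_get_disks)
        disks_name=$(echo "$disks_json" | jq -r '.[] | .nbd_device')
        echo "$disks_name" | grep -c /dev/nbd || true   # grep -c exits 1 on zero matches; true keeps set -e happy
    }

    count=$(nbd_get_count /var/tmp/spdk-nbd.sock)
    (( count == 0 )) || exit 1     # after the stop loop no /dev/nbd entries may remain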
00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:51.601 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:51.602 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:26:51.859 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:26:52.117 /dev/nbd0 00:26:52.117 04:27:39 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:52.117 1+0 records in 00:26:52.117 1+0 records out 00:26:52.117 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237837 s, 17.2 MB/s 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:26:52.117 04:27:39 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:26:52.375 /dev/nbd1 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:52.375 1+0 records in 00:26:52.375 1+0 records out 00:26:52.375 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258205 s, 15.9 MB/s 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:26:52.375 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:26:52.683 /dev/nbd10 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:52.683 1+0 records in 00:26:52.683 1+0 records out 00:26:52.683 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289884 s, 14.1 MB/s 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:26:52.683 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:26:52.942 /dev/nbd11 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:52.942 1+0 records in 00:26:52.942 1+0 records out 00:26:52.942 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252428 s, 16.2 MB/s 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:52.942 04:27:40 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:26:53.200 { 00:26:53.200 "nbd_device": "/dev/nbd0", 00:26:53.200 "bdev_name": "crypto_ram" 00:26:53.200 }, 00:26:53.200 { 00:26:53.200 "nbd_device": "/dev/nbd1", 00:26:53.200 "bdev_name": "crypto_ram2" 00:26:53.200 }, 00:26:53.200 { 00:26:53.200 "nbd_device": "/dev/nbd10", 00:26:53.200 "bdev_name": "crypto_ram3" 00:26:53.200 }, 00:26:53.200 { 00:26:53.200 "nbd_device": "/dev/nbd11", 00:26:53.200 "bdev_name": "crypto_ram4" 00:26:53.200 } 00:26:53.200 ]' 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:26:53.200 { 00:26:53.200 "nbd_device": "/dev/nbd0", 00:26:53.200 "bdev_name": "crypto_ram" 00:26:53.200 }, 00:26:53.200 { 00:26:53.200 "nbd_device": "/dev/nbd1", 00:26:53.200 "bdev_name": "crypto_ram2" 00:26:53.200 }, 00:26:53.200 { 00:26:53.200 "nbd_device": "/dev/nbd10", 00:26:53.200 "bdev_name": "crypto_ram3" 00:26:53.200 }, 00:26:53.200 { 00:26:53.200 "nbd_device": "/dev/nbd11", 00:26:53.200 "bdev_name": "crypto_ram4" 00:26:53.200 } 00:26:53.200 ]' 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:26:53.200 /dev/nbd1 00:26:53.200 /dev/nbd10 00:26:53.200 /dev/nbd11' 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:26:53.200 /dev/nbd1 00:26:53.200 /dev/nbd10 00:26:53.200 /dev/nbd11' 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:26:53.200 256+0 records in 00:26:53.200 256+0 records out 00:26:53.200 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00514221 s, 204 MB/s 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:53.200 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:26:53.458 256+0 records in 00:26:53.459 256+0 records out 00:26:53.459 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0536546 s, 19.5 MB/s 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:26:53.459 256+0 records in 00:26:53.459 256+0 records out 00:26:53.459 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0504801 s, 20.8 MB/s 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:26:53.459 256+0 records in 00:26:53.459 256+0 records out 00:26:53.459 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0518377 s, 20.2 MB/s 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:26:53.459 256+0 records in 00:26:53.459 256+0 records out 00:26:53.459 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0474725 s, 22.1 MB/s 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:53.459 
04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:53.459 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:53.717 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:53.717 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:53.717 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:53.717 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:53.717 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:53.717 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:53.717 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:53.717 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:53.717 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:53.717 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:26:53.974 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:53.974 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:53.974 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:53.974 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:53.974 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:53.974 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:53.974 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:53.974 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:53.974 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:53.974 04:27:41 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:26:54.231 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:26:54.232 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:26:54.232 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:26:54.232 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:54.232 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:54.232 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:26:54.232 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:54.232 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:54.232 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:54.232 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:54.489 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:26:55.054 04:27:42 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:26:55.054 malloc_lvol_verify 00:26:55.054 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:26:55.312 9cbc65e3-207b-4421-b8f0-f09b2a612691 00:26:55.570 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:26:55.570 642477d9-a871-4c3c-b56d-a2ad0f466ef4 00:26:55.570 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:26:56.135 /dev/nbd0 00:26:56.135 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:26:56.135 mke2fs 1.46.5 (30-Dec-2021) 00:26:56.135 Discarding device blocks: 0/4096 done 00:26:56.135 Creating filesystem with 4096 1k blocks and 1024 inodes 00:26:56.135 00:26:56.135 Allocating group tables: 0/1 done 00:26:56.135 Writing inode tables: 0/1 done 00:26:56.135 Creating journal (1024 blocks): done 00:26:56.135 Writing superblocks and filesystem accounting information: 0/1 done 00:26:56.135 00:26:56.135 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:26:56.135 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:26:56.135 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:26:56.135 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:56.135 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:56.135 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:26:56.135 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:56.135 04:27:43 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3975250 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 3975250 ']' 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 3975250 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3975250 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3975250' 00:26:56.392 killing process with pid 3975250 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@965 -- # kill 3975250 00:26:56.392 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@970 -- # wait 3975250 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:26:56.958 00:26:56.958 real 0m10.873s 00:26:56.958 user 0m14.515s 00:26:56.958 sys 0m3.929s 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:26:56.958 ************************************ 00:26:56.958 END TEST bdev_nbd 00:26:56.958 ************************************ 00:26:56.958 04:27:44 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:26:56.958 04:27:44 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:26:56.958 04:27:44 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:26:56.958 04:27:44 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:26:56.958 04:27:44 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:26:56.958 04:27:44 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:56.958 04:27:44 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:26:56.958 ************************************ 00:26:56.958 START TEST bdev_fio 00:26:56.958 ************************************ 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:26:56.958 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:26:56.958 04:27:44 
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:26:56.958 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:26:56.959 ************************************ 00:26:56.959 START TEST bdev_fio_rw_verify 00:26:56.959 ************************************ 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:26:56.959 04:27:44 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:26:57.217 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:57.217 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:57.217 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:57.217 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:26:57.217 fio-3.35 00:26:57.217 Starting 4 threads 00:27:12.136 00:27:12.136 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3977139: Wed May 15 04:27:58 2024 00:27:12.136 read: IOPS=25.4k, BW=99.4MiB/s (104MB/s)(994MiB/10001msec) 00:27:12.136 slat (usec): min=13, max=456, avg=51.39, stdev=24.65 00:27:12.136 clat (usec): min=15, max=1140, avg=281.24, stdev=162.50 00:27:12.136 lat (usec): min=47, max=1199, avg=332.63, stdev=175.00 00:27:12.136 clat percentiles (usec): 00:27:12.136 | 50.000th=[ 249], 99.000th=[ 758], 99.900th=[ 881], 99.990th=[ 971], 00:27:12.136 | 99.999th=[ 1029] 00:27:12.136 write: IOPS=28.0k, BW=109MiB/s (115MB/s)(1066MiB/9754msec); 0 zone resets 00:27:12.136 slat (usec): min=23, max=1479, avg=62.66, stdev=24.51 00:27:12.136 clat (usec): min=30, max=2387, avg=347.36, stdev=194.97 00:27:12.136 lat (usec): min=54, max=2447, avg=410.02, stdev=206.94 00:27:12.136 clat percentiles (usec): 00:27:12.136 | 50.000th=[ 318], 99.000th=[ 947], 99.900th=[ 1123], 99.990th=[ 1254], 00:27:12.136 | 99.999th=[ 2073] 00:27:12.136 bw ( KiB/s): min=94976, max=132432, per=97.70%, avg=109323.37, stdev=2445.26, samples=76 00:27:12.136 iops : min=23744, max=33108, avg=27330.84, stdev=611.32, samples=76 00:27:12.136 lat (usec) : 20=0.01%, 50=0.01%, 100=7.46%, 250=34.80%, 500=43.15% 00:27:12.136 lat (usec) : 750=11.47%, 1000=2.82% 00:27:12.136 lat (msec) : 2=0.29%, 4=0.01% 00:27:12.136 cpu : usr=99.44%, sys=0.00%, ctx=61, majf=0, minf=335 00:27:12.136 IO depths : 1=10.0%, 2=25.6%, 4=51.3%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:12.136 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:12.136 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:12.136 issued rwts: total=254423,272872,0,0 short=0,0,0,0 dropped=0,0,0,0 
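For reference, the job sections echoed at blockdev.sh@341-343 above amount to appending roughly the following to bdev.fio. This is a sketch only: the [global] portion comes from the template cat'd by fio_config_gen and is not reproduced here, and serialize_overlap=1 is the extra line emitted after the AIO/fio-version check at autotest_common.sh@1319-1321.

# Sketch only -- mirrors the echo statements in the trace above
cat >> /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio <<'EOF'
serialize_overlap=1
[job_crypto_ram]
filename=crypto_ram
[job_crypto_ram2]
filename=crypto_ram2
[job_crypto_ram3]
filename=crypto_ram3
[job_crypto_ram4]
filename=crypto_ram4
EOF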
00:27:12.136 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:12.136 00:27:12.136 Run status group 0 (all jobs): 00:27:12.136 READ: bw=99.4MiB/s (104MB/s), 99.4MiB/s-99.4MiB/s (104MB/s-104MB/s), io=994MiB (1042MB), run=10001-10001msec 00:27:12.136 WRITE: bw=109MiB/s (115MB/s), 109MiB/s-109MiB/s (115MB/s-115MB/s), io=1066MiB (1118MB), run=9754-9754msec 00:27:12.136 00:27:12.136 real 0m13.610s 00:27:12.136 user 0m43.108s 00:27:12.136 sys 0m0.562s 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:27:12.136 ************************************ 00:27:12.136 END TEST bdev_fio_rw_verify 00:27:12.136 ************************************ 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:27:12.136 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "faa734cf-c27f-52bf-97df-9cafcce04e6f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "faa734cf-c27f-52bf-97df-9cafcce04e6f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "f6a57a23-cf0e-57d9-a6dd-de74e7e13f20"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f6a57a23-cf0e-57d9-a6dd-de74e7e13f20",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "d54eb04f-b9b4-57a2-94e7-d55fafd2adf1"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d54eb04f-b9b4-57a2-94e7-d55fafd2adf1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "176c5e10-f83b-5181-a786-0adc7fb48ac7"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "176c5e10-f83b-5181-a786-0adc7fb48ac7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:27:12.137 crypto_ram2 00:27:12.137 crypto_ram3 00:27:12.137 crypto_ram4 ]] 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "faa734cf-c27f-52bf-97df-9cafcce04e6f"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "faa734cf-c27f-52bf-97df-9cafcce04e6f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "f6a57a23-cf0e-57d9-a6dd-de74e7e13f20"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f6a57a23-cf0e-57d9-a6dd-de74e7e13f20",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "d54eb04f-b9b4-57a2-94e7-d55fafd2adf1"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "d54eb04f-b9b4-57a2-94e7-d55fafd2adf1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "176c5e10-f83b-5181-a786-0adc7fb48ac7"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "176c5e10-f83b-5181-a786-0adc7fb48ac7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1103 -- # xtrace_disable 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:12.137 ************************************ 00:27:12.137 START TEST bdev_fio_trim 00:27:12.137 ************************************ 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:12.137 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:27:12.138 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:27:12.138 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:27:12.138 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:27:12.138 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:27:12.138 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:12.138 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:27:12.138 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:27:12.138 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:27:12.138 04:27:58 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:27:12.138 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:12.138 04:27:58 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:12.138 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:12.138 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:12.138 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:12.138 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:12.138 fio-3.35 00:27:12.138 Starting 4 threads 00:27:24.330 00:27:24.331 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3978814: Wed May 15 04:28:11 2024 00:27:24.331 write: IOPS=39.0k, BW=152MiB/s (160MB/s)(1523MiB/10001msec); 0 zone resets 00:27:24.331 slat (usec): min=13, max=499, avg=57.76, stdev=28.18 00:27:24.331 clat (usec): min=26, max=2083, avg=260.00, stdev=156.56 00:27:24.331 lat (usec): min=67, max=2180, avg=317.77, stdev=174.94 00:27:24.331 clat percentiles (usec): 00:27:24.331 | 50.000th=[ 223], 99.000th=[ 766], 99.900th=[ 881], 99.990th=[ 963], 00:27:24.331 | 99.999th=[ 1696] 00:27:24.331 bw ( KiB/s): min=142880, max=193488, per=99.41%, avg=155022.74, stdev=3190.68, samples=76 00:27:24.331 iops : min=35720, max=48372, avg=38755.68, stdev=797.67, samples=76 00:27:24.331 trim: IOPS=39.0k, BW=152MiB/s (160MB/s)(1523MiB/10001msec); 0 zone resets 00:27:24.331 slat (usec): min=5, max=386, avg=15.71, stdev= 6.45 00:27:24.331 clat (usec): min=52, max=1898, avg=245.03, stdev=103.91 00:27:24.331 lat (usec): min=62, max=1921, avg=260.74, stdev=105.71 00:27:24.331 clat percentiles (usec): 00:27:24.331 | 50.000th=[ 233], 99.000th=[ 529], 99.900th=[ 594], 99.990th=[ 652], 00:27:24.331 | 99.999th=[ 1549] 00:27:24.331 bw ( KiB/s): min=142880, max=193520, per=99.41%, avg=155024.00, stdev=3191.57, samples=76 00:27:24.331 iops : min=35720, max=48380, avg=38756.00, stdev=797.89, samples=76 00:27:24.331 lat (usec) : 50=0.01%, 100=7.23%, 250=50.70%, 500=36.67%, 750=4.77% 00:27:24.331 lat (usec) : 1000=0.63% 00:27:24.331 lat (msec) : 2=0.01%, 4=0.01% 00:27:24.331 cpu : usr=99.43%, sys=0.01%, ctx=63, majf=0, minf=121 00:27:24.331 IO depths : 1=7.7%, 2=26.4%, 4=52.7%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:27:24.331 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:24.331 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:27:24.331 issued rwts: total=0,389889,389889,0 short=0,0,0,0 dropped=0,0,0,0 00:27:24.331 latency : target=0, window=0, percentile=100.00%, depth=8 00:27:24.331 00:27:24.331 Run status group 0 (all jobs): 00:27:24.331 WRITE: bw=152MiB/s (160MB/s), 152MiB/s-152MiB/s (160MB/s-160MB/s), io=1523MiB (1597MB), run=10001-10001msec 00:27:24.331 TRIM: bw=152MiB/s (160MB/s), 
152MiB/s-152MiB/s (160MB/s-160MB/s), io=1523MiB (1597MB), run=10001-10001msec 00:27:24.331 00:27:24.331 real 0m13.569s 00:27:24.331 user 0m43.160s 00:27:24.331 sys 0m0.546s 00:27:24.331 04:28:12 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:24.331 04:28:12 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:27:24.331 ************************************ 00:27:24.331 END TEST bdev_fio_trim 00:27:24.331 ************************************ 00:27:24.331 04:28:12 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:27:24.331 04:28:12 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:24.331 04:28:12 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:27:24.331 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:24.331 04:28:12 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:27:24.331 00:27:24.331 real 0m27.434s 00:27:24.331 user 1m26.410s 00:27:24.331 sys 0m1.227s 00:27:24.331 04:28:12 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:24.331 04:28:12 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:24.331 ************************************ 00:27:24.331 END TEST bdev_fio 00:27:24.331 ************************************ 00:27:24.331 04:28:12 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:24.331 04:28:12 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:24.331 04:28:12 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:27:24.331 04:28:12 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:24.331 04:28:12 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:24.331 ************************************ 00:27:24.331 START TEST bdev_verify 00:27:24.331 ************************************ 00:27:24.331 04:28:12 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:27:24.331 [2024-05-15 04:28:12.266152] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
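The bdev_verify stage starting above is the bdevperf example app driven against the same bdev.json used for the fio passes. A sketch of the standalone invocation follows, with flag values copied from the run_test line above; the comment on -C reflects a reading of bdevperf's usage text and should be confirmed with bdevperf --help on this SPDK revision.

# Sketch of the standalone invocation; paths and values come from the run_test line above.
BDEVPERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
CONF=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
args=(
  --json "$CONF"   # bdev/accel configuration describing the crypto_ram* stack
  -q 128           # queue depth per job
  -o 4096          # I/O size in bytes (the big-I/O pass later in the log uses 65536)
  -w verify        # workload: write a pattern, read it back, compare
  -t 5             # run time in seconds
  -C               # let every core in the mask submit I/O to each bdev
  -m 0x3           # core mask: cores 0 and 1
)
"$BDEVPERF" "${args[@]}"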
00:27:24.331 [2024-05-15 04:28:12.266233] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3980141 ] 00:27:24.588 [2024-05-15 04:28:12.347183] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:24.588 [2024-05-15 04:28:12.468097] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:24.588 [2024-05-15 04:28:12.468102] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:24.588 [2024-05-15 04:28:12.489434] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:24.588 [2024-05-15 04:28:12.497463] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:24.588 [2024-05-15 04:28:12.505482] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:24.846 [2024-05-15 04:28:12.625273] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:27.373 [2024-05-15 04:28:15.016935] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:27.373 [2024-05-15 04:28:15.017014] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:27.373 [2024-05-15 04:28:15.017031] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:27.373 [2024-05-15 04:28:15.024950] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:27.373 [2024-05-15 04:28:15.024977] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:27.373 [2024-05-15 04:28:15.024990] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:27.373 [2024-05-15 04:28:15.032971] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:27.373 [2024-05-15 04:28:15.032996] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:27.373 [2024-05-15 04:28:15.033008] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:27.373 [2024-05-15 04:28:15.040992] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:27.373 [2024-05-15 04:28:15.041016] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:27.373 [2024-05-15 04:28:15.041028] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:27.373 Running I/O for 5 seconds... 
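The "Found key test_dek_aesni_cbc_*" and "vbdev creation deferred pending base bdev arrival" notices above are the crypto vbdevs from bdev.json coming up on top of their Malloc base bdevs. Rebuilding one such stack by hand against a running target would look roughly like the sketch below; the key bytes and Malloc size are illustrative, and option spellings can differ between SPDK releases, so confirm each call with scripts/rpc.py <method> -h.

RPC=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
# Register an AES_CBC key in the accel framework (illustrative 128-bit key).
$RPC accel_crypto_key_create -c AES_CBC -n test_dek_aesni_cbc_1 \
    -k 00112233445566778899aabbccddeeff
# Create the base bdev (illustrative size: 32 MiB, 512 B blocks).
$RPC bdev_malloc_create -b Malloc0 32 512
# Layer the crypto vbdev on top, matching the base_bdev_name/name/key_name
# fields shown in the JSON dump earlier in the log.
$RPC bdev_crypto_create -n test_dek_aesni_cbc_1 Malloc0 crypto_ram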
00:27:32.633 00:27:32.633 Latency(us) 00:27:32.633 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:32.633 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:32.633 Verification LBA range: start 0x0 length 0x1000 00:27:32.633 crypto_ram : 5.06 594.71 2.32 0.00 0.00 214378.80 2961.26 119615.34 00:27:32.633 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:32.633 Verification LBA range: start 0x1000 length 0x1000 00:27:32.633 crypto_ram : 5.06 599.50 2.34 0.00 0.00 212802.92 3859.34 119615.34 00:27:32.633 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:32.633 Verification LBA range: start 0x0 length 0x1000 00:27:32.633 crypto_ram2 : 5.06 595.99 2.33 0.00 0.00 213555.82 4004.98 119615.34 00:27:32.633 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:32.633 Verification LBA range: start 0x1000 length 0x1000 00:27:32.633 crypto_ram2 : 5.06 600.80 2.35 0.00 0.00 211987.12 5097.24 118838.61 00:27:32.633 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:32.633 Verification LBA range: start 0x0 length 0x1000 00:27:32.633 crypto_ram3 : 5.04 4658.73 18.20 0.00 0.00 27281.98 2815.62 18252.99 00:27:32.633 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:32.633 Verification LBA range: start 0x1000 length 0x1000 00:27:32.633 crypto_ram3 : 5.04 4676.68 18.27 0.00 0.00 27192.93 5752.60 18155.90 00:27:32.633 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:27:32.633 Verification LBA range: start 0x0 length 0x1000 00:27:32.633 crypto_ram4 : 5.04 4668.09 18.23 0.00 0.00 27207.54 2330.17 18058.81 00:27:32.633 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:27:32.633 Verification LBA range: start 0x1000 length 0x1000 00:27:32.633 crypto_ram4 : 5.04 4682.76 18.29 0.00 0.00 27108.45 1468.49 18155.90 00:27:32.633 =================================================================================================================== 00:27:32.633 Total : 21077.26 82.33 0.00 0.00 48357.66 1468.49 119615.34 00:27:32.891 00:27:32.891 real 0m8.460s 00:27:32.891 user 0m15.929s 00:27:32.891 sys 0m0.507s 00:27:32.891 04:28:20 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:32.891 04:28:20 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:27:32.891 ************************************ 00:27:32.891 END TEST bdev_verify 00:27:32.891 ************************************ 00:27:32.891 04:28:20 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:27:32.891 04:28:20 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:27:32.891 04:28:20 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:32.891 04:28:20 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:32.891 ************************************ 00:27:32.891 START TEST bdev_verify_big_io 00:27:32.891 ************************************ 00:27:32.891 04:28:20 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:27:32.891 [2024-05-15 04:28:20.781959] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:27:32.891 [2024-05-15 04:28:20.782040] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3981089 ] 00:27:32.891 [2024-05-15 04:28:20.857686] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:27:33.149 [2024-05-15 04:28:20.975873] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:33.149 [2024-05-15 04:28:20.975877] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:33.149 [2024-05-15 04:28:20.997159] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:33.149 [2024-05-15 04:28:21.005172] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:33.149 [2024-05-15 04:28:21.013191] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:33.149 [2024-05-15 04:28:21.126082] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:35.678 [2024-05-15 04:28:23.501847] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:35.678 [2024-05-15 04:28:23.501928] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:35.678 [2024-05-15 04:28:23.501946] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:35.678 [2024-05-15 04:28:23.509862] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:35.678 [2024-05-15 04:28:23.509888] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:35.678 [2024-05-15 04:28:23.509900] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:35.678 [2024-05-15 04:28:23.517874] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:35.678 [2024-05-15 04:28:23.517905] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:35.678 [2024-05-15 04:28:23.517919] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:35.678 [2024-05-15 04:28:23.525894] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:35.678 [2024-05-15 04:28:23.525918] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:35.678 [2024-05-15 04:28:23.525931] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:35.678 Running I/O for 5 seconds... 00:27:38.959 [2024-05-15 04:28:26.255307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.959 [2024-05-15 04:28:26.257141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.959 [2024-05-15 04:28:26.258578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
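The "*ERROR*: Failed to get src_mbufs!" lines that begin here come from accel_dpdk_cryptodev_task_alloc_resources() hitting transient mbuf-pool exhaustion while 128 outstanding 64 KiB I/Os are queued against the crypto bdevs; the run continues past them. When reproducing locally, a smaller queue depth should ease the pressure, e.g. (illustrative -q value, same binary and config as the big-I/O pass above):

/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
    --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
    -q 32 -o 65536 -w verify -t 5 -C -m 0x3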
00:27:38.959 [2024-05-15 04:28:26.258940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.468794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (the same message repeats continuously between these two timestamps, with only the timestamp changing)
00:27:38.960 [2024-05-15 04:28:26.468874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.470297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.470358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.471649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.471709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.472408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.472467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.472836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.472912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.475831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.475904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.477754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.477816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.478596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.478655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.479457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.479516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.481892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.481944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.482833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.482913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.483715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.483776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.485605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.960 [2024-05-15 04:28:26.485671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.488810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.488899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.489223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.489281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.490018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.490071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.491701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.491762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.493586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.493646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.493990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.494041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.494782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.494878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.495203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.495268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.496971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.497027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.497378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.497437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.498270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.498332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.498680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.960 [2024-05-15 04:28:26.498739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.500258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.500327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.500675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.500731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.501449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.501518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.501887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.501939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.503645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.503706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.504050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.504101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.504948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.505000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.505347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.505404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.506958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.507011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.507357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.507418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.507453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.507796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.508283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.960 [2024-05-15 04:28:26.508353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.508699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.508768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.508795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.509062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.510003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.510349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.510410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.510757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.511066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.511276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.511626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.511684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.512029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.512295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.513378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.513437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.513491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.513544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.513871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.514035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.514085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.514161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.514216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.960 [2024-05-15 04:28:26.514597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.515687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.515754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.515819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.515905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.516152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.516338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.516396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.516451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.516512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.516910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.517948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.517998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.518049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.518102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.518506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.518723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.518780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.518845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.518912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.519254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.520213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.520287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.520341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.960 [2024-05-15 04:28:26.520395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.520701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.520913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.520962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.521008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.521054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.521382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.522329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.522391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.522445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.522503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.522806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.522995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.523044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.523091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.523160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.523414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.524392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.524457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.524513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.524567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.524866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.525056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.960 [2024-05-15 04:28:26.525104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.960 [2024-05-15 04:28:26.525180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.525233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.525547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.526386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.526446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.526499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.526552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.526946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.527111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.527180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.527234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.527288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.527542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.528730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.528788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.528852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.528919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.529171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.529359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.529420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.529479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.529532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.529786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.530649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.961 [2024-05-15 04:28:26.530708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.530761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.530819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.531148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.531347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.531404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.531458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.531511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.531837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.532687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.532746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.532799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.532879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.533138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.533323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.533384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.533444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.533498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.533750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.534699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.534758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.534818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.534894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.535113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.961 [2024-05-15 04:28:26.535318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.535379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.535432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.535487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.535765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.536848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.536915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.536961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.537006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.537259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.537442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.537503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.537559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.537621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.537893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.692197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.694080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.696055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.698075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.700102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.701548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.702965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.704745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.706925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.961 [2024-05-15 04:28:26.708537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.709997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.711776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.714454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.716030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.717851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.719273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.721034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.722494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.724254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.724905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.728072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.730140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.732037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.733041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.734895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.736672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.737722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.738059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.740836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.742753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.743298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.744753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.746892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.961 [2024-05-15 04:28:26.748422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.748773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.749102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.752007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.752658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.754604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.756654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.759043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.759396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.759746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.760681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.763033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.764630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.766057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.767505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.768250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.768600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.768965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.770537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.773040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.774593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.776426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.776775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.778047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.961 [2024-05-15 04:28:26.779327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.780709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.781612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.783050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.784141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.785431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.786608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.788288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.789297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.789646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.789998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.792347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.793628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.793689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.794313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.795093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.797013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.797066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.799004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.800171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.800522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.800580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.800940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.801341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.961 [2024-05-15 04:28:26.802595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.802655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.803018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.804176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.804526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.804584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.805588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.806036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.807201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.807261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.809068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.810354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.810704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.810765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.812206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.812627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.813581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.813642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.961 [2024-05-15 04:28:26.814567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.962 [2024-05-15 04:28:26.815756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.962 [2024-05-15 04:28:26.817565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.962 [2024-05-15 04:28:26.817629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.962 [2024-05-15 04:28:26.819101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:38.962 [2024-05-15 04:28:26.819515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:38.962 [2024-05-15 04:28:26.821018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:38.962 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats several hundred times between 04:28:26.821 and 04:28:27.225; duplicate entries omitted ...]
00:27:38.962 [2024-05-15 04:28:26.904252] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:27:38.962 [... the dst_mbufs variant from accel_dpdk_cryptodev.c:476 appears five times between 04:28:26.904 and 04:28:26.906 ...]
00:27:39.225 [2024-05-15 04:28:27.225047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:39.225 [2024-05-15 04:28:27.225472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.225 [2024-05-15 04:28:27.225743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.225 [2024-05-15 04:28:27.230988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.225 [2024-05-15 04:28:27.231041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.225 [2024-05-15 04:28:27.231390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.225 [2024-05-15 04:28:27.231450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.225 [2024-05-15 04:28:27.233150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.225 [2024-05-15 04:28:27.233236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.225 [2024-05-15 04:28:27.233846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.225 [2024-05-15 04:28:27.233916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.225 [2024-05-15 04:28:27.234154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.239490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.239551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.240053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.240105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.241835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.241904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.243280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.243340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.243649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.245965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.246021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.247967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.248028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.486 [2024-05-15 04:28:27.248891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.248944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.250193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.250254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.250574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.257451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.257517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.259285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.259374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.261184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.261258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.262639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.262699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.262966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.265724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.265787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.266285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.266350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.268562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.268629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.269874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.269926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.270299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.275019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.486 [2024-05-15 04:28:27.275071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.276538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.276598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.277541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.277602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.278637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.278696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.278966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.283887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.283947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.284939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.284990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.287127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.287206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.287553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.287612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.287894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.292528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.292591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.294069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.294138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.295290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.295351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.296755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.486 [2024-05-15 04:28:27.296815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.297136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.301044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.301096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.301451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.301511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.302285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.302351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.303913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.303964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.304364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.308676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.308739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.309351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.309410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.310097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.310164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.310514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.310574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.310917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.315095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.315176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.315523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.315591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.486 [2024-05-15 04:28:27.316802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.316883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.318794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.318878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.486 [2024-05-15 04:28:27.319205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.322440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.322502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.322885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.322935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.323683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.323750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.324978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.325034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.325433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.326806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.326889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.327228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.327290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.328001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.328058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.329798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.329877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.330150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.334062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.487 [2024-05-15 04:28:27.334133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.334481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.334540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.335326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.336582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.336641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.337003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.337259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.340247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.340307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.341514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.341574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.342096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.342462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.342814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.342899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.343184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.345850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.345923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.346463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.347878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.349208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.349270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.349621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.487 [2024-05-15 04:28:27.349686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.349991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.351954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.352299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.352660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.352723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.354654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.355454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.355515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.356297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.356595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.362158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.362219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.363441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.363526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.365294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.365356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.366640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.366701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.367022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.371030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.371083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.371998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.372049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.487 [2024-05-15 04:28:27.373717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.373779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.374847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.374913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.375200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.381278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.381338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.381686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.381747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.383889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.383941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.384792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.384872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.385134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.387751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.387819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.389685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.389748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.390464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.390528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.392078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.392132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.392407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.396765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.487 [2024-05-15 04:28:27.396834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.397924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.398407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.487 [2024-05-15 04:28:27.400381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.400443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.400789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.400872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.401098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.405583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.405660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.407610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.407677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.408140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.408201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.408547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.408607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.408890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.413279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.413351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.413410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.413465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.415378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.415438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.415497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.488 [2024-05-15 04:28:27.415550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.415895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.419790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.419865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.419922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.419969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.420385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.420449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.420504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.420559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.420818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.424553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.424614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.424670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.424723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.425182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.425242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.425296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.425348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.425667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.429367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.429426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.429480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.429534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.488 [2024-05-15 04:28:27.429969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.430019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.430064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.430109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.430400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.432938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.432988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.433032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.433076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.433562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.433621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.433673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.433726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.434095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.439000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.439056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.439107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.439178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.439598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.439657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.439714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.439779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.440033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.442043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.488 [2024-05-15 04:28:27.442096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.442141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.442209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.442616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.442681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.442740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.442796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.443045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.446856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.446928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.446974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.447019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.447453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.447510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.447562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.447620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.447986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.451625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.451692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.451756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.451811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.452213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.452281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.452337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.488 [2024-05-15 04:28:27.452390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.452655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.454651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.488 [2024-05-15 04:28:27.456389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.456456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.457747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.458230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.459054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.459107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.460737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.461055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.465159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.466874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.466952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.468539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.468979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.470502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.470562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.471475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.471777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.474380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.475739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.475799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:39.489 [2024-05-15 04:28:27.477737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:39.489 [2024-05-15 04:28:27.478186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:39.489 (the same *ERROR* record from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats several hundred times, with only its timestamps advancing, between 04:28:27.478186 and 04:28:27.977144)
00:27:40.014 [2024-05-15 04:28:27.977144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:40.014 [2024-05-15 04:28:27.977588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.977646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.977702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.977758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.978017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.978946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.978997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.979043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.979086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.979566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.979625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.979678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.979736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.979998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.980975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.981031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.981075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.981119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.981557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.981615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.014 [2024-05-15 04:28:27.981671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.981727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.981994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.982931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.015 [2024-05-15 04:28:27.982981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.983028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.983076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.983516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.983573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.983626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.983679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.983967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.984963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.985016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.985060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.985104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.985501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.985565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.985617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.985675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.986028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.986963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.987011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.987056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.987115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.987556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.987628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.987681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.015 [2024-05-15 04:28:27.987735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.988005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.988944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.990705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.990765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.992784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.993347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.994789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.994856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.995578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.995909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.996794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.997618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.997680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.999192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:27.999608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.001036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.001103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.002136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.002516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.003505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.004966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.005017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.006440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.015 [2024-05-15 04:28:28.006914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.008457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.008518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.009933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.010179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.011335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.011735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.011794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.013172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.013599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.015016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.015065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.015962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.016189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.017102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.018735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.018799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.019152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.019633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.021295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.021357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.022847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.023144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.015 [2024-05-15 04:28:28.024079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.276 [2024-05-15 04:28:28.026071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.026162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.027877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.028276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.028632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.028691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.029038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.029318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.030240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.031777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.031848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.033916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.034335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.036365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.036425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.038253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.038635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.039722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.041589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.041651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.043306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.043721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.045578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.045641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.276 [2024-05-15 04:28:28.047696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.047970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.048968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.049314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.049373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.051346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.051759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.053655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.053718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.055349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.055617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.056555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.058104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.058172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.058519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.059000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.060538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.060598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.062279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.062607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.066719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.068265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.276 [2024-05-15 04:28:28.068326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.069635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.277 [2024-05-15 04:28:28.070150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.070518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.070578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.071982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.072258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.075920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.077348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.077408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.078957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.079414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.079763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.079821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.081270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.081597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.082445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.083711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.083771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.084093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.084599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.085325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.085386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.086284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.086554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.087481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.277 [2024-05-15 04:28:28.088435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.088503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.088871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.089363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.089422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.090531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.090593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.090890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.091791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.092563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.092624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.094057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.095654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.095714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.095769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.096989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.097324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.098173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.099191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.099252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.099311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.099867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.100185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.100244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.277 [2024-05-15 04:28:28.101343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.101672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.104020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.104081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.104138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.105728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.106161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.106224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.106571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.106630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.107037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.108000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.108794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.108866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.110798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.111255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.112545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.112610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.112961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.113215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.114380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.115568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.115629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.116037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.277 [2024-05-15 04:28:28.116449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.118337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.118403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.118748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.119096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.120039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.121130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.121184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.123008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.123477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.123953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.124003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.124337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.124614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.125474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.126334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.126394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.127324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.277 [2024-05-15 04:28:28.127736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.128072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.128122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.128487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.128754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.129685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.278 [2024-05-15 04:28:28.131295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.131359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.131416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.131816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.132143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.132218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.132563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.132840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.133786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.135560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.135625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.137688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.138471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.139106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.139160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.140079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.140357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.141270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.142414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.142764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.143093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.143557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.144902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.145461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.278 [2024-05-15 04:28:28.147403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.147726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.149014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.151068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.152525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.153056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.154847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.155288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.155637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.155990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.156245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.158282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.160172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.160935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.162053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.163026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.164280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.164629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.164976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.165251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.170228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.171669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.172391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.172739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.278 [2024-05-15 04:28:28.174118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.175945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.176536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.177044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.177311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.178538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.180154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.180506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.181742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.182507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.182879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.183186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.183549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.183900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.185186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.185540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.185898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.186248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.187008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.187354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.187704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.188044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.188389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.278 [2024-05-15 04:28:28.189640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.278 [2024-05-15 04:28:28.189999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:40.545 [2024-05-15 04:28:28.496126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:27:40.545 (several hundred identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468, logged between 04:28:28.189999 and 04:28:28.496126, collapsed; only the first and last occurrences are shown)
00:27:40.545 [2024-05-15 04:28:28.496490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.496549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.496604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.497109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.499119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.499195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.501219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.501542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.502377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.502727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.502787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.503121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.503917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.504246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.504306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.504650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.504932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.505904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.506229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.506583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.506940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.507439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.507788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.508109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.545 [2024-05-15 04:28:28.508473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.508772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.509976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.510314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.510665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.511006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.511867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.512188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.512540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.512903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.513158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.514422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.514773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.515097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.515462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.516223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.516575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.545 [2024-05-15 04:28:28.516936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.517270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.517686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.519130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.519493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.519850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.520183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.546 [2024-05-15 04:28:28.520947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.521281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.521630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.522521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.522835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.524002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.524356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.525584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.526849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.528476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.529755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.530672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.531020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.531386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.532688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.534519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.536033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.536422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.537162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.539078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.541022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.541366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.541636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.542850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.546 [2024-05-15 04:28:28.543253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.544784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.546560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.548345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.549992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.550327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.550679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.550993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.553831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.554187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.546 [2024-05-15 04:28:28.554536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.555031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.557466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.558026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.559381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.561349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.561693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.563458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.565136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.566162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.567098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.567818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.568132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.569461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.807 [2024-05-15 04:28:28.570427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.570702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.572657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.572731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.573060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.573116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.575397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.575463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.577470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.577536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.577845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.578990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.579042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.579387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.579449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.580734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.580797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.581474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.581534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.581798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.583031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.583083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.583513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.807 [2024-05-15 04:28:28.583574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.807 [2024-05-15 04:28:28.585694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.585757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.587063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.587113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.587465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.588719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.588781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.590797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.590863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.591592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.591653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.592895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.592947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.593192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.595059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.595130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.596041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.596092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.598392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.598456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.600019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.600070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.600396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.602854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.808 [2024-05-15 04:28:28.602926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.604775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.604845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.606106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.606169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.607250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.607320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.607728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.610746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.610811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.612758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.612821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.614626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.614688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.615947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.615997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.616299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.619416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.619468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.621219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.621282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.622954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.623006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.624571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.808 [2024-05-15 04:28:28.624632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.624996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.627696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.627758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.628655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.628715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.630929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.630980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.631317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.631378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.631665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.632990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.633041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.634026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.634080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.634885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.634936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.635262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.635323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.635587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.636810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.636905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.637232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.637293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.808 [2024-05-15 04:28:28.638371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.638432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.808 [2024-05-15 04:28:28.639280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.639344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.639661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.642447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.642512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.642881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.642933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.644812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.644893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.645219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.645279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.645630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.648394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.648470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.650073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.650144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.652407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.652480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.654523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.654588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.654860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.657789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.809 [2024-05-15 04:28:28.657885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.659940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.660002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.662313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.663651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.663711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.665194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.665492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.666695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.666758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.667199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.667259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.667719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.669193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.670658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.670718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.671028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.673914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.673981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.675723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.676056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.677999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.678049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.679706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.809 [2024-05-15 04:28:28.679768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.680029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.680936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.682353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.683780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.683851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.684920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.685247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.685307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.685941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.686220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.688739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.688799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.690543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.690606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.692881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.692943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.694937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.694997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.695372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.697687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.697748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.699301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.699360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.809 [2024-05-15 04:28:28.700533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.700594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.701985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.702034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.702340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.703504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.703569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.703924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.703974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.705831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.705900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.707309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.707369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.707689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.710335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.710398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.712384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.712444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.713260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.713321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.715207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.715290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.715554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.809 [2024-05-15 04:28:28.716989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.810 [2024-05-15 04:28:28.717039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.718526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.719949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.721618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.721683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.722018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.722069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.722409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.725184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.725255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.727022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.727082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.727512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.727571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.729328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.729400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.729664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.730819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.730897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.730941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.730986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.733190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.733263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:40.810 [2024-05-15 04:28:28.733322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.810 [2024-05-15 04:28:28.733377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:40.810 [ ... the identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeats several hundred times between 04:28:28.733 and 04:28:29.060; duplicate entries omitted ... ] 
00:27:41.077 [2024-05-15 04:28:29.060231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.077 [2024-05-15 04:28:29.060289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.077 [2024-05-15 04:28:29.060551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.077 [2024-05-15 04:28:29.061570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.077 [2024-05-15 04:28:29.063411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.063760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.063819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.065456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.066901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.066965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.068398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.068665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.071101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.071165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.072969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.073037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.073819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.073896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.074230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.074288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.074552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.076655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.076715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.078639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.078700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:27:41.078 [2024-05-15 04:28:29.081051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.081119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.082718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.082776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.083214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.085523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.085584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.086267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.078 [2024-05-15 04:28:29.086336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.088406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.088468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.088815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.088894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.089174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.091306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.091367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.092589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.092655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.093523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.093581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.093936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.094015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:27:41.336 [2024-05-15 04:28:29.094296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
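The repeated allocation failures above appear to come from the dpdk_cryptodev accel module running out of source mbufs under the 128-deep verify workload; the 0.00 Fail/s column in the table below suggests the tasks are retried and the verification still completes. If the full console output has been archived, the elided repetitions can be tallied with a one-liner such as the following (build.log is a placeholder for wherever this console log was saved, not a file produced by the job):

  grep -c 'Failed to get src_mbufs' build.log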
00:27:41.595 00:27:41.595 Latency(us) 00:27:41.595 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:41.595 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:27:41.595 Verification LBA range: start 0x0 length 0x100 00:27:41.595 crypto_ram : 5.84 43.81 2.74 0.00 0.00 2840339.34 59807.67 2709209.69 00:27:41.595 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:27:41.595 Verification LBA range: start 0x100 length 0x100 00:27:41.595 crypto_ram : 5.81 44.08 2.75 0.00 0.00 2816201.77 68351.62 2622216.72 00:27:41.595 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:27:41.595 Verification LBA range: start 0x0 length 0x100 00:27:41.595 crypto_ram2 : 5.84 43.81 2.74 0.00 0.00 2732845.13 59419.31 2709209.69 00:27:41.595 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:27:41.595 Verification LBA range: start 0x100 length 0x100 00:27:41.595 crypto_ram2 : 5.81 44.07 2.75 0.00 0.00 2710318.65 67963.26 2547651.32 00:27:41.595 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:27:41.595 Verification LBA range: start 0x0 length 0x100 00:27:41.595 crypto_ram3 : 5.60 271.72 16.98 0.00 0.00 418539.44 59030.95 593416.34 00:27:41.595 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:27:41.595 Verification LBA range: start 0x100 length 0x100 00:27:41.595 crypto_ram3 : 5.58 287.97 18.00 0.00 0.00 395682.99 29903.83 593416.34 00:27:41.595 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:27:41.595 Verification LBA range: start 0x0 length 0x100 00:27:41.595 crypto_ram4 : 5.71 290.25 18.14 0.00 0.00 380299.04 1723.35 475354.45 00:27:41.595 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:27:41.595 Verification LBA range: start 0x100 length 0x100 00:27:41.595 crypto_ram4 : 5.71 307.06 19.19 0.00 0.00 359940.52 12913.02 515744.05 00:27:41.595 =================================================================================================================== 00:27:41.595 Total : 1332.77 83.30 0.00 0.00 710777.37 1723.35 2709209.69 00:27:42.160 00:27:42.160 real 0m9.243s 00:27:42.160 user 0m17.434s 00:27:42.160 sys 0m0.559s 00:27:42.160 04:28:29 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:42.160 04:28:29 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:27:42.160 ************************************ 00:27:42.160 END TEST bdev_verify_big_io 00:27:42.160 ************************************ 00:27:42.160 04:28:29 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:42.160 04:28:29 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:27:42.160 04:28:29 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:42.160 04:28:29 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:42.160 ************************************ 00:27:42.160 START TEST bdev_write_zeroes 00:27:42.160 ************************************ 00:27:42.160 04:28:30 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:42.161 [2024-05-15 04:28:30.076482] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:27:42.161 [2024-05-15 04:28:30.076571] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3982156 ] 00:27:42.161 [2024-05-15 04:28:30.159694] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:42.418 [2024-05-15 04:28:30.282089] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:42.418 [2024-05-15 04:28:30.303464] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:27:42.418 [2024-05-15 04:28:30.311480] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:27:42.418 [2024-05-15 04:28:30.319528] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:27:42.676 [2024-05-15 04:28:30.439301] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:27:45.203 [2024-05-15 04:28:32.852606] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:27:45.203 [2024-05-15 04:28:32.852689] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:45.203 [2024-05-15 04:28:32.852724] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:45.203 [2024-05-15 04:28:32.860624] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:27:45.203 [2024-05-15 04:28:32.860654] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:45.203 [2024-05-15 04:28:32.860669] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:45.203 [2024-05-15 04:28:32.868645] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:27:45.203 [2024-05-15 04:28:32.868674] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:27:45.203 [2024-05-15 04:28:32.868689] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:45.203 [2024-05-15 04:28:32.876684] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:27:45.203 [2024-05-15 04:28:32.876713] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:27:45.203 [2024-05-15 04:28:32.876727] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:45.203 Running I/O for 1 seconds... 
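The write_zeroes pass is an ordinary bdevperf run against the generated crypto bdev configuration; repeating just this step outside the harness amounts to the command below (paths copied from this job, and assuming the generated test/bdev/bdev.json has not yet been cleaned up):

  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  "$SPDK_DIR/build/examples/bdevperf" \
      --json "$SPDK_DIR/test/bdev/bdev.json" \
      -q 128 -o 4096 -w write_zeroes -t 1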
00:27:46.135 00:27:46.135 Latency(us) 00:27:46.135 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:46.135 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:46.135 crypto_ram : 1.03 1925.45 7.52 0.00 0.00 65947.02 5534.15 78837.38 00:27:46.135 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:46.135 crypto_ram2 : 1.03 1938.89 7.57 0.00 0.00 65197.95 5461.33 73400.32 00:27:46.135 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:46.135 crypto_ram3 : 1.02 14793.38 57.79 0.00 0.00 8519.88 2609.30 11165.39 00:27:46.135 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:27:46.135 crypto_ram4 : 1.02 14831.71 57.94 0.00 0.00 8470.59 2621.44 8883.77 00:27:46.135 =================================================================================================================== 00:27:46.135 Total : 33489.43 130.82 0.00 0.00 15109.33 2609.30 78837.38 00:27:46.763 00:27:46.763 real 0m4.460s 00:27:46.763 user 0m3.898s 00:27:46.763 sys 0m0.512s 00:27:46.763 04:28:34 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:46.763 04:28:34 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:27:46.763 ************************************ 00:27:46.763 END TEST bdev_write_zeroes 00:27:46.763 ************************************ 00:27:46.763 04:28:34 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:46.763 04:28:34 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:27:46.763 04:28:34 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:46.763 04:28:34 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:46.763 ************************************ 00:27:46.763 START TEST bdev_json_nonenclosed 00:27:46.763 ************************************ 00:27:46.763 04:28:34 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:46.763 [2024-05-15 04:28:34.589078] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:27:46.763 [2024-05-15 04:28:34.589153] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3982699 ] 00:27:46.763 [2024-05-15 04:28:34.672026] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:47.041 [2024-05-15 04:28:34.790725] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:47.041 [2024-05-15 04:28:34.790821] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
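The MiB/s column in the table above appears to be IOPS multiplied by the 4096-byte IO size; a quick one-off sanity check of the crypto_ram3 row (not part of the harness):

  awk 'BEGIN { printf "%.2f MiB/s\n", 14793.38 * 4096 / (1024 * 1024) }'   # prints 57.79, matching the table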
00:27:47.041 [2024-05-15 04:28:34.790869] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:27:47.041 [2024-05-15 04:28:34.790883] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:47.041 00:27:47.041 real 0m0.387s 00:27:47.041 user 0m0.270s 00:27:47.041 sys 0m0.114s 00:27:47.041 04:28:34 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:47.041 04:28:34 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:27:47.041 ************************************ 00:27:47.041 END TEST bdev_json_nonenclosed 00:27:47.041 ************************************ 00:27:47.041 04:28:34 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:47.041 04:28:34 blockdev_crypto_aesni -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:27:47.041 04:28:34 blockdev_crypto_aesni -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:47.041 04:28:34 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:47.041 ************************************ 00:27:47.041 START TEST bdev_json_nonarray 00:27:47.041 ************************************ 00:27:47.041 04:28:34 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:27:47.041 [2024-05-15 04:28:35.022170] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:27:47.041 [2024-05-15 04:28:35.022240] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3982844 ] 00:27:47.299 [2024-05-15 04:28:35.101112] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:47.299 [2024-05-15 04:28:35.217411] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:47.299 [2024-05-15 04:28:35.217536] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:27:47.299 [2024-05-15 04:28:35.217560] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:27:47.299 [2024-05-15 04:28:35.217573] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:27:47.557 00:27:47.557 real 0m0.375s 00:27:47.557 user 0m0.275s 00:27:47.557 sys 0m0.098s 00:27:47.557 04:28:35 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:47.557 04:28:35 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:27:47.557 ************************************ 00:27:47.557 END TEST bdev_json_nonarray 00:27:47.557 ************************************ 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:27:47.557 04:28:35 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:27:47.557 00:27:47.557 real 1m13.371s 00:27:47.557 user 2m36.637s 00:27:47.557 sys 0m9.142s 00:27:47.557 04:28:35 blockdev_crypto_aesni -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:47.557 04:28:35 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:27:47.557 ************************************ 00:27:47.557 END TEST blockdev_crypto_aesni 00:27:47.557 ************************************ 00:27:47.557 04:28:35 -- spdk/autotest.sh@354 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:27:47.557 04:28:35 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:27:47.557 04:28:35 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:47.557 04:28:35 -- common/autotest_common.sh@10 -- # set +x 00:27:47.557 ************************************ 00:27:47.557 START TEST blockdev_crypto_sw 00:27:47.557 ************************************ 00:27:47.557 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:27:47.557 * Looking for test storage... 
00:27:47.557 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3982909 00:27:47.557 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:27:47.558 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:27:47.558 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 3982909 00:27:47.558 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@827 -- # '[' -z 3982909 ']' 00:27:47.558 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:47.558 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:47.558 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:27:47.558 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:47.558 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:47.558 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:47.558 [2024-05-15 04:28:35.524924] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:27:47.558 [2024-05-15 04:28:35.524998] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3982909 ] 00:27:47.816 [2024-05-15 04:28:35.609712] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:47.816 [2024-05-15 04:28:35.730660] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:47.816 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:47.816 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@860 -- # return 0 00:27:47.816 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:27:47.816 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:27:47.816 04:28:35 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:27:47.816 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:47.816 04:28:35 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:48.073 Malloc0 00:27:48.073 Malloc1 00:27:48.073 true 00:27:48.073 true 00:27:48.073 true 00:27:48.073 [2024-05-15 04:28:36.076161] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:48.073 crypto_ram 00:27:48.073 [2024-05-15 04:28:36.084192] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:27:48.073 crypto_ram2 00:27:48.331 [2024-05-15 04:28:36.092226] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:27:48.331 crypto_ram3 00:27:48.331 [ 00:27:48.331 { 00:27:48.331 "name": "Malloc1", 00:27:48.331 "aliases": [ 00:27:48.331 "39f20261-00bf-4635-9f6c-230348040a27" 00:27:48.331 ], 00:27:48.331 "product_name": "Malloc disk", 00:27:48.331 "block_size": 4096, 00:27:48.331 "num_blocks": 4096, 00:27:48.331 "uuid": "39f20261-00bf-4635-9f6c-230348040a27", 00:27:48.331 "assigned_rate_limits": { 00:27:48.331 "rw_ios_per_sec": 0, 00:27:48.331 "rw_mbytes_per_sec": 0, 00:27:48.331 "r_mbytes_per_sec": 0, 00:27:48.331 "w_mbytes_per_sec": 0 00:27:48.331 }, 00:27:48.331 "claimed": true, 00:27:48.331 "claim_type": "exclusive_write", 00:27:48.331 "zoned": false, 00:27:48.331 "supported_io_types": { 00:27:48.331 "read": true, 00:27:48.331 "write": true, 00:27:48.331 "unmap": true, 00:27:48.331 "write_zeroes": true, 00:27:48.331 "flush": true, 00:27:48.331 "reset": true, 00:27:48.331 "compare": false, 00:27:48.331 "compare_and_write": false, 00:27:48.331 "abort": true, 00:27:48.331 "nvme_admin": false, 00:27:48.331 "nvme_io": false 00:27:48.331 }, 00:27:48.331 "memory_domains": [ 00:27:48.331 { 00:27:48.331 "dma_device_id": "system", 00:27:48.331 "dma_device_type": 1 00:27:48.331 }, 00:27:48.331 { 00:27:48.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:48.331 "dma_device_type": 2 00:27:48.331 } 00:27:48.331 ], 00:27:48.331 "driver_specific": {} 00:27:48.331 } 00:27:48.331 ] 00:27:48.331 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.331 04:28:36 blockdev_crypto_sw -- 
bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:27:48.331 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:48.331 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:48.331 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.331 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:27:48.331 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:27:48.331 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:48.331 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:48.331 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.331 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:27:48.331 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:48.331 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:48.331 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.331 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.332 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:27:48.332 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:48.332 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:27:48.332 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:27:48.332 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:27:48.332 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "8f84258c-d0ae-5a71-9258-679eb3ca76d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "8f84258c-d0ae-5a71-9258-679eb3ca76d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1a64b164-b52b-53ec-b6e1-8ebbfb6e3fc2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' 
' "num_blocks": 4096,' ' "uuid": "1a64b164-b52b-53ec-b6e1-8ebbfb6e3fc2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:27:48.332 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:27:48.332 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:27:48.332 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:27:48.332 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 3982909 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@946 -- # '[' -z 3982909 ']' 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@950 -- # kill -0 3982909 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@951 -- # uname 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3982909 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3982909' 00:27:48.332 killing process with pid 3982909 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@965 -- # kill 3982909 00:27:48.332 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@970 -- # wait 3982909 00:27:48.898 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:27:48.898 04:28:36 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:48.898 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:27:48.898 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:48.898 04:28:36 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:48.898 ************************************ 00:27:48.898 START TEST bdev_hello_world 00:27:48.898 ************************************ 00:27:48.898 04:28:36 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:27:48.898 [2024-05-15 04:28:36.842454] Starting SPDK v24.05-pre git sha1 2dc74a001 
/ DPDK 23.11.0 initialization... 00:27:48.898 [2024-05-15 04:28:36.842520] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3983068 ] 00:27:49.157 [2024-05-15 04:28:36.924784] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:49.157 [2024-05-15 04:28:37.052654] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:49.414 [2024-05-15 04:28:37.240154] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:49.414 [2024-05-15 04:28:37.240244] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:49.414 [2024-05-15 04:28:37.240264] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:49.414 [2024-05-15 04:28:37.248171] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:27:49.414 [2024-05-15 04:28:37.248202] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:49.414 [2024-05-15 04:28:37.248218] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:49.414 [2024-05-15 04:28:37.256203] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:27:49.414 [2024-05-15 04:28:37.256238] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:27:49.414 [2024-05-15 04:28:37.256261] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:49.414 [2024-05-15 04:28:37.299666] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:27:49.414 [2024-05-15 04:28:37.299723] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:27:49.414 [2024-05-15 04:28:37.299755] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:27:49.414 [2024-05-15 04:28:37.301116] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:27:49.414 [2024-05-15 04:28:37.301218] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:27:49.414 [2024-05-15 04:28:37.301238] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:27:49.414 [2024-05-15 04:28:37.301272] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
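hello_bdev opens the crypto_ram vbdev, writes a test string, and reads it back; the "Hello World!" line above is the decrypted read completing. Re-running only this step uses the same command the harness wraps (paths copied from the log, assuming the generated bdev.json is still present):

  SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
  "$SPDK_DIR/build/examples/hello_bdev" --json "$SPDK_DIR/test/bdev/bdev.json" -b crypto_ram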
00:27:49.414 00:27:49.414 [2024-05-15 04:28:37.301292] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:27:49.671 00:27:49.671 real 0m0.786s 00:27:49.671 user 0m0.563s 00:27:49.671 sys 0m0.203s 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:27:49.672 ************************************ 00:27:49.672 END TEST bdev_hello_world 00:27:49.672 ************************************ 00:27:49.672 04:28:37 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:27:49.672 04:28:37 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:27:49.672 04:28:37 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:49.672 04:28:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:49.672 ************************************ 00:27:49.672 START TEST bdev_bounds 00:27:49.672 ************************************ 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3983217 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3983217' 00:27:49.672 Process bdevio pid: 3983217 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3983217 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 3983217 ']' 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:27:49.672 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:49.672 04:28:37 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:49.672 [2024-05-15 04:28:37.683017] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:27:49.672 [2024-05-15 04:28:37.683083] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3983217 ] 00:27:49.930 [2024-05-15 04:28:37.764419] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:27:49.930 [2024-05-15 04:28:37.886474] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:27:49.930 [2024-05-15 04:28:37.886542] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:27:49.930 [2024-05-15 04:28:37.886544] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:50.188 [2024-05-15 04:28:38.070018] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:50.188 [2024-05-15 04:28:38.070106] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:50.188 [2024-05-15 04:28:38.070123] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:50.188 [2024-05-15 04:28:38.078054] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:27:50.188 [2024-05-15 04:28:38.078089] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:50.188 [2024-05-15 04:28:38.078103] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:50.188 [2024-05-15 04:28:38.086055] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:27:50.188 [2024-05-15 04:28:38.086080] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:27:50.188 [2024-05-15 04:28:38.086092] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:50.753 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:50.753 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:27:50.753 04:28:38 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:27:50.753 I/O targets: 00:27:50.753 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:27:50.753 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:27:50.753 00:27:50.753 00:27:50.753 CUnit - A unit testing framework for C - Version 2.1-3 00:27:50.753 http://cunit.sourceforge.net/ 00:27:50.753 00:27:50.753 00:27:50.754 Suite: bdevio tests on: crypto_ram3 00:27:50.754 Test: blockdev write read block ...passed 00:27:50.754 Test: blockdev write zeroes read block ...passed 00:27:51.011 Test: blockdev write zeroes read no split ...passed 00:27:51.011 Test: blockdev write zeroes read split ...passed 00:27:51.011 Test: blockdev write zeroes read split partial ...passed 00:27:51.011 Test: blockdev reset ...passed 00:27:51.011 Test: blockdev write read 8 blocks ...passed 00:27:51.011 Test: blockdev write read size > 128k ...passed 00:27:51.011 Test: blockdev write read invalid size ...passed 00:27:51.011 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:51.011 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:51.011 Test: blockdev write read max offset ...passed 00:27:51.011 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:51.011 Test: blockdev writev readv 8 blocks ...passed 00:27:51.011 Test: 
blockdev writev readv 30 x 1block ...passed 00:27:51.011 Test: blockdev writev readv block ...passed 00:27:51.011 Test: blockdev writev readv size > 128k ...passed 00:27:51.011 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:51.011 Test: blockdev comparev and writev ...passed 00:27:51.011 Test: blockdev nvme passthru rw ...passed 00:27:51.011 Test: blockdev nvme passthru vendor specific ...passed 00:27:51.011 Test: blockdev nvme admin passthru ...passed 00:27:51.011 Test: blockdev copy ...passed 00:27:51.011 Suite: bdevio tests on: crypto_ram 00:27:51.011 Test: blockdev write read block ...passed 00:27:51.011 Test: blockdev write zeroes read block ...passed 00:27:51.011 Test: blockdev write zeroes read no split ...passed 00:27:51.011 Test: blockdev write zeroes read split ...passed 00:27:51.011 Test: blockdev write zeroes read split partial ...passed 00:27:51.011 Test: blockdev reset ...passed 00:27:51.011 Test: blockdev write read 8 blocks ...passed 00:27:51.011 Test: blockdev write read size > 128k ...passed 00:27:51.011 Test: blockdev write read invalid size ...passed 00:27:51.011 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:27:51.011 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:27:51.011 Test: blockdev write read max offset ...passed 00:27:51.011 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:27:51.011 Test: blockdev writev readv 8 blocks ...passed 00:27:51.011 Test: blockdev writev readv 30 x 1block ...passed 00:27:51.011 Test: blockdev writev readv block ...passed 00:27:51.011 Test: blockdev writev readv size > 128k ...passed 00:27:51.011 Test: blockdev writev readv size > 128k in two iovs ...passed 00:27:51.011 Test: blockdev comparev and writev ...passed 00:27:51.011 Test: blockdev nvme passthru rw ...passed 00:27:51.011 Test: blockdev nvme passthru vendor specific ...passed 00:27:51.011 Test: blockdev nvme admin passthru ...passed 00:27:51.011 Test: blockdev copy ...passed 00:27:51.011 00:27:51.011 Run Summary: Type Total Ran Passed Failed Inactive 00:27:51.011 suites 2 2 n/a 0 0 00:27:51.011 tests 46 46 46 0 0 00:27:51.011 asserts 260 260 260 0 n/a 00:27:51.011 00:27:51.011 Elapsed time = 0.099 seconds 00:27:51.011 0 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3983217 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 3983217 ']' 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 3983217 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3983217 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3983217' 00:27:51.011 killing process with pid 3983217 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@965 -- # kill 3983217 00:27:51.011 04:28:38 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@970 -- # wait 3983217 00:27:51.269 
04:28:39 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:27:51.269 00:27:51.269 real 0m1.483s 00:27:51.269 user 0m3.932s 00:27:51.269 sys 0m0.332s 00:27:51.269 04:28:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:51.269 04:28:39 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:27:51.269 ************************************ 00:27:51.269 END TEST bdev_bounds 00:27:51.269 ************************************ 00:27:51.270 04:28:39 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:27:51.270 04:28:39 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:27:51.270 04:28:39 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:51.270 04:28:39 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:51.270 ************************************ 00:27:51.270 START TEST bdev_nbd 00:27:51.270 ************************************ 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3983383 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 
'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3983383 /var/tmp/spdk-nbd.sock 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 3983383 ']' 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:27:51.270 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:27:51.270 04:28:39 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:51.270 [2024-05-15 04:28:39.219456] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:27:51.270 [2024-05-15 04:28:39.219539] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:51.528 [2024-05-15 04:28:39.296777] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:51.528 [2024-05-15 04:28:39.403724] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:27:51.786 [2024-05-15 04:28:39.581649] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:27:51.786 [2024-05-15 04:28:39.581723] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:27:51.786 [2024-05-15 04:28:39.581743] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:51.786 [2024-05-15 04:28:39.589666] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:27:51.786 [2024-05-15 04:28:39.589696] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:27:51.786 [2024-05-15 04:28:39.589711] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:51.786 [2024-05-15 04:28:39.597687] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:27:51.786 [2024-05-15 04:28:39.597716] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:27:51.786 [2024-05-15 04:28:39.597730] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # 
nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:27:52.366 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:52.623 1+0 records in 00:27:52.623 1+0 records out 00:27:52.623 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000223988 s, 18.3 MB/s 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:27:52.623 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:27:52.881 
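crypto_ram has just been exported as /dev/nbd0 over the app's private RPC socket, and waitfornbd proves the node is usable before the test moves on. A rough equivalent of that polling loop, reconstructed from the trace (the sleep interval and the scratch-file path are assumptions):

    waitfornbd() {
        local nbd_name=$1 i size
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break   # kernel has registered the device
            sleep 0.1
        done
        # read one 4 KiB block with O_DIRECT to prove the export actually serves I/O
        dd if=/dev/"$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct || return 1
        size=$(stat -c %s /tmp/nbdtest)
        rm -f /tmp/nbdtest
        [ "$size" != 0 ]
    }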
04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:52.881 1+0 records in 00:27:52.881 1+0 records out 00:27:52.881 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022689 s, 18.1 MB/s 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:27:52.881 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:53.138 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:27:53.138 { 00:27:53.138 "nbd_device": "/dev/nbd0", 00:27:53.138 "bdev_name": "crypto_ram" 00:27:53.138 }, 00:27:53.138 { 00:27:53.138 "nbd_device": "/dev/nbd1", 00:27:53.138 "bdev_name": "crypto_ram3" 00:27:53.138 } 00:27:53.138 ]' 00:27:53.139 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:27:53.139 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:27:53.139 { 00:27:53.139 "nbd_device": "/dev/nbd0", 00:27:53.139 "bdev_name": "crypto_ram" 00:27:53.139 }, 00:27:53.139 { 00:27:53.139 "nbd_device": "/dev/nbd1", 00:27:53.139 "bdev_name": "crypto_ram3" 00:27:53.139 } 00:27:53.139 ]' 00:27:53.139 04:28:40 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:27:53.139 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:27:53.139 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:53.139 04:28:41 
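Both exports are then listed back through nbd_get_disks and torn down one by one; turning the RPC's JSON into device paths is just a jq filter. Condensed, the start/stop-verify cycle traced here is roughly (rpc.py path shortened):

    sock=/var/tmp/spdk-nbd.sock
    # list the active exports and keep only the /dev/nbdX paths
    disks=$(scripts/rpc.py -s "$sock" nbd_get_disks | jq -r '.[] | .nbd_device')
    for d in $disks; do
        scripts/rpc.py -s "$sock" nbd_stop_disk "$d"   # detach the bdev from its nbd node
    done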
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:53.139 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:53.139 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:53.139 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:53.139 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:53.396 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:53.396 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:53.396 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:53.396 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:53.396 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:53.396 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:53.396 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:53.396 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:53.396 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:53.396 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:53.654 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@65 -- # count=0 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:53.912 04:28:41 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:27:54.169 /dev/nbd0 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 
00:27:54.169 1+0 records in 00:27:54.169 1+0 records out 00:27:54.169 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258476 s, 15.8 MB/s 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:54.169 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:27:54.427 /dev/nbd1 00:27:54.427 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:54.427 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:54.684 1+0 records in 00:27:54.684 1+0 records out 00:27:54.684 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240526 s, 17.0 MB/s 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:27:54.684 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:27:54.942 { 00:27:54.942 "nbd_device": "/dev/nbd0", 00:27:54.942 "bdev_name": "crypto_ram" 00:27:54.942 }, 00:27:54.942 { 00:27:54.942 "nbd_device": "/dev/nbd1", 00:27:54.942 "bdev_name": "crypto_ram3" 00:27:54.942 } 00:27:54.942 ]' 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:27:54.942 { 00:27:54.942 "nbd_device": "/dev/nbd0", 00:27:54.942 "bdev_name": "crypto_ram" 00:27:54.942 }, 00:27:54.942 { 00:27:54.942 "nbd_device": "/dev/nbd1", 00:27:54.942 "bdev_name": "crypto_ram3" 00:27:54.942 } 00:27:54.942 ]' 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:27:54.942 /dev/nbd1' 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:27:54.942 /dev/nbd1' 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:27:54.942 256+0 records in 00:27:54.942 256+0 records out 00:27:54.942 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00524689 s, 200 MB/s 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:27:54.942 256+0 records in 00:27:54.942 256+0 records out 00:27:54.942 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0247285 s, 42.4 MB/s 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:27:54.942 256+0 records in 00:27:54.942 256+0 records out 00:27:54.942 1048576 
bytes (1.0 MB, 1.0 MiB) copied, 0.035498 s, 29.5 MB/s 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:54.942 04:28:42 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:55.199 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:55.199 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:55.199 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:55.199 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:55.199 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:55.199 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:55.199 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:55.199 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:55.199 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:55.200 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:27:55.457 04:28:43 
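The data pass that just ran is a plain dd/cmp round trip: a 1 MiB random buffer is written through each nbd export with O_DIRECT and then compared back byte for byte. In condensed form (temp-file path shortened):

    dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256            # 1 MiB of random data
    for d in /dev/nbd0 /dev/nbd1; do
        dd if=/tmp/nbdrandtest of="$d" bs=4096 count=256 oflag=direct   # push it through the export
    done
    for d in /dev/nbd0 /dev/nbd1; do
        cmp -b -n 1M /tmp/nbdrandtest "$d"                              # byte-for-byte read-back check
    done
    rm /tmp/nbdrandtest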
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:55.457 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:55.457 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:55.457 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:55.457 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:55.457 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:55.457 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:55.457 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:55.457 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:27:55.457 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:55.457 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:27:56.021 04:28:43 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:27:56.021 malloc_lvol_verify 00:27:56.021 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:27:56.279 067b1298-f9e3-46ca-a777-618ae57bfaf0 00:27:56.279 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
bdev_lvol_create lvol 4 -l lvs 00:27:56.537 69ee0ccc-3763-47aa-9bd5-84815923020f 00:27:56.537 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:27:56.794 /dev/nbd0 00:27:56.794 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:27:56.794 mke2fs 1.46.5 (30-Dec-2021) 00:27:56.794 Discarding device blocks: 0/4096 done 00:27:56.794 Creating filesystem with 4096 1k blocks and 1024 inodes 00:27:56.794 00:27:56.794 Allocating group tables: 0/1 done 00:27:56.794 Writing inode tables: 0/1 done 00:27:56.794 Creating journal (1024 blocks): done 00:27:56.794 Writing superblocks and filesystem accounting information: 0/1 done 00:27:56.794 00:27:56.794 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:27:56.794 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:27:56.794 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:27:56.794 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:56.794 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:56.794 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:27:56.794 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:56.794 04:28:44 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3983383 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 3983383 ']' 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 3983383 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3983383 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = 
sudo ']' 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3983383' 00:27:57.052 killing process with pid 3983383 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@965 -- # kill 3983383 00:27:57.052 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@970 -- # wait 3983383 00:27:57.310 04:28:45 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:27:57.310 00:27:57.310 real 0m6.138s 00:27:57.310 user 0m8.966s 00:27:57.310 sys 0m2.218s 00:27:57.310 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:27:57.310 04:28:45 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:27:57.310 ************************************ 00:27:57.310 END TEST bdev_nbd 00:27:57.310 ************************************ 00:27:57.568 04:28:45 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:27:57.568 04:28:45 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:27:57.568 04:28:45 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:27:57.568 04:28:45 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:27:57.568 04:28:45 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:27:57.568 04:28:45 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:57.568 04:28:45 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:27:57.568 ************************************ 00:27:57.568 START TEST bdev_fio 00:27:57.568 ************************************ 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:27:57.568 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 
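bdev_fio next generates test/bdev/bdev.fio for a verify workload and, as the following lines show, appends one job section per crypto bdev. The resulting job file is roughly the sketch below; only serialize_overlap=1 and the [job_*]/filename lines come from the trace, while the global verify options are assumptions about what fio_config_gen writes:

    cat > test/bdev/bdev.fio <<'EOF'
    [global]
    thread=1
    direct=1
    verify=md5           ; assumed default for the verify workload
    serialize_overlap=1  ; emitted because the fio binary is 3.x

    [job_crypto_ram]
    filename=crypto_ram

    [job_crypto_ram3]
    filename=crypto_ram3
    EOF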
00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:27:57.568 04:28:45 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:27:57.568 ************************************ 00:27:57.568 START TEST bdev_fio_rw_verify 00:27:57.568 ************************************ 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:27:57.569 04:28:45 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:27:57.827 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, 
(T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:57.827 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:27:57.827 fio-3.35 00:27:57.827 Starting 2 threads 00:28:10.020 00:28:10.020 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=3984389: Wed May 15 04:28:56 2024 00:28:10.020 read: IOPS=19.6k, BW=76.5MiB/s (80.2MB/s)(765MiB/10000msec) 00:28:10.020 slat (usec): min=10, max=1812, avg=24.09, stdev= 9.65 00:28:10.020 clat (usec): min=6, max=2124, avg=161.51, stdev=70.86 00:28:10.020 lat (usec): min=22, max=2152, avg=185.60, stdev=74.40 00:28:10.020 clat percentiles (usec): 00:28:10.020 | 50.000th=[ 155], 99.000th=[ 347], 99.900th=[ 408], 99.990th=[ 445], 00:28:10.020 | 99.999th=[ 2089] 00:28:10.020 write: IOPS=23.5k, BW=91.9MiB/s (96.3MB/s)(874MiB/9515msec); 0 zone resets 00:28:10.020 slat (usec): min=12, max=232, avg=38.79, stdev= 9.88 00:28:10.020 clat (usec): min=21, max=1079, avg=221.38, stdev=102.85 00:28:10.020 lat (usec): min=48, max=1119, avg=260.17, stdev=105.08 00:28:10.020 clat percentiles (usec): 00:28:10.020 | 50.000th=[ 215], 99.000th=[ 453], 99.900th=[ 519], 99.990th=[ 578], 00:28:10.020 | 99.999th=[ 988] 00:28:10.020 bw ( KiB/s): min=79464, max=120608, per=95.16%, avg=89533.05, stdev=5136.97, samples=38 00:28:10.020 iops : min=19866, max=30152, avg=22383.26, stdev=1284.24, samples=38 00:28:10.020 lat (usec) : 10=0.01%, 20=0.01%, 50=2.05%, 100=15.17%, 250=56.43% 00:28:10.020 lat (usec) : 500=26.24%, 750=0.10%, 1000=0.01% 00:28:10.020 lat (msec) : 2=0.01%, 4=0.01% 00:28:10.020 cpu : usr=99.44%, sys=0.01%, ctx=70, majf=0, minf=481 00:28:10.020 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:10.020 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:10.020 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:10.020 issued rwts: total=195814,223800,0,0 short=0,0,0,0 dropped=0,0,0,0 00:28:10.020 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:10.020 00:28:10.020 Run status group 0 (all jobs): 00:28:10.020 READ: bw=76.5MiB/s (80.2MB/s), 76.5MiB/s-76.5MiB/s (80.2MB/s-80.2MB/s), io=765MiB (802MB), run=10000-10000msec 00:28:10.020 WRITE: bw=91.9MiB/s (96.3MB/s), 91.9MiB/s-91.9MiB/s (96.3MB/s-96.3MB/s), io=874MiB (917MB), run=9515-9515msec 00:28:10.020 00:28:10.020 real 0m11.030s 00:28:10.020 user 0m20.883s 00:28:10.020 sys 0m0.280s 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:28:10.020 ************************************ 00:28:10.020 END TEST bdev_fio_rw_verify 00:28:10.020 ************************************ 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:28:10.020 04:28:56 
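With the read/write verify pass done, the trim pass rebuilds bdev.fio, but only for bdevs that advertise unmap support; the selection is a jq filter over each bdev's JSON description. As a stand-alone sketch the same check could be written as follows (bdev_get_bdevs is the generic RPC for dumping bdev descriptions; the trace itself filters an already-captured JSON blob rather than calling it here):

    scripts/rpc.py bdev_get_bdevs \
        | jq -r '.[] | select(.supported_io_types.unmap == true) | .name'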
blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "8f84258c-d0ae-5a71-9258-679eb3ca76d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "8f84258c-d0ae-5a71-9258-679eb3ca76d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1a64b164-b52b-53ec-b6e1-8ebbfb6e3fc2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "1a64b164-b52b-53ec-b6e1-8ebbfb6e3fc2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' 
"driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:28:10.020 crypto_ram3 ]] 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "8f84258c-d0ae-5a71-9258-679eb3ca76d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "8f84258c-d0ae-5a71-9258-679eb3ca76d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1a64b164-b52b-53ec-b6e1-8ebbfb6e3fc2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "1a64b164-b52b-53ec-b6e1-8ebbfb6e3fc2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:28:10.020 04:28:56 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test 
bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:10.021 ************************************ 00:28:10.021 START TEST bdev_fio_trim 00:28:10.021 ************************************ 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:28:10.021 04:28:56 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:28:10.021 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:10.021 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:28:10.021 fio-3.35 00:28:10.021 Starting 2 threads 00:28:19.983 00:28:19.983 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=3985784: Wed May 15 04:29:07 2024 00:28:19.983 write: IOPS=31.6k, BW=123MiB/s (129MB/s)(1234MiB/10001msec); 0 zone resets 00:28:19.983 slat (usec): min=10, max=382, avg=28.36, stdev=10.21 00:28:19.983 clat (usec): min=27, max=1673, avg=203.97, stdev=137.09 00:28:19.983 lat (usec): min=39, max=1691, avg=232.33, stdev=144.28 00:28:19.983 clat percentiles (usec): 00:28:19.983 | 50.000th=[ 159], 99.000th=[ 529], 99.900th=[ 570], 99.990th=[ 611], 00:28:19.983 | 99.999th=[ 947] 00:28:19.983 bw ( KiB/s): min=109920, max=184064, per=99.90%, avg=126235.37, stdev=13531.13, samples=38 00:28:19.983 iops : min=27480, max=46016, avg=31558.84, stdev=3382.78, samples=38 00:28:19.983 trim: IOPS=31.6k, BW=123MiB/s (129MB/s)(1234MiB/10001msec); 0 zone resets 00:28:19.983 slat (usec): min=4, max=1546, avg=12.96, stdev= 6.03 00:28:19.983 clat (usec): min=7, max=1691, avg=135.04, stdev=55.27 00:28:19.983 lat (usec): min=18, max=1701, avg=147.99, stdev=57.44 00:28:19.983 clat percentiles (usec): 00:28:19.983 | 50.000th=[ 126], 99.000th=[ 265], 99.900th=[ 293], 99.990th=[ 314], 00:28:19.983 | 99.999th=[ 619] 00:28:19.983 bw ( KiB/s): min=109920, max=184064, per=99.90%, avg=126236.63, stdev=13532.49, samples=38 00:28:19.983 iops : min=27480, max=46016, avg=31559.16, stdev=3383.12, samples=38 00:28:19.983 lat (usec) : 10=0.01%, 50=4.85%, 100=25.17%, 250=52.95%, 500=15.85% 00:28:19.983 lat (usec) : 750=1.18%, 1000=0.01% 00:28:19.983 lat (msec) : 2=0.01% 00:28:19.983 cpu : usr=99.39%, sys=0.03%, ctx=28, majf=0, minf=381 00:28:19.983 IO depths : 1=7.7%, 2=17.8%, 4=59.7%, 8=14.9%, 16=0.0%, 32=0.0%, >=64=0.0% 00:28:19.983 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:19.983 complete : 0=0.0%, 4=87.0%, 8=13.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:28:19.983 issued rwts: total=0,315943,315945,0 short=0,0,0,0 dropped=0,0,0,0 00:28:19.983 latency : target=0, window=0, percentile=100.00%, depth=8 00:28:19.983 00:28:19.983 Run status group 0 (all jobs): 
00:28:19.983 WRITE: bw=123MiB/s (129MB/s), 123MiB/s-123MiB/s (129MB/s-129MB/s), io=1234MiB (1294MB), run=10001-10001msec 00:28:19.983 TRIM: bw=123MiB/s (129MB/s), 123MiB/s-123MiB/s (129MB/s-129MB/s), io=1234MiB (1294MB), run=10001-10001msec 00:28:19.983 00:28:19.983 real 0m10.993s 00:28:19.983 user 0m20.912s 00:28:19.983 sys 0m0.280s 00:28:19.983 04:29:07 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:19.983 04:29:07 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:28:19.983 ************************************ 00:28:19.983 END TEST bdev_fio_trim 00:28:19.983 ************************************ 00:28:19.983 04:29:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:28:19.983 04:29:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:28:19.983 04:29:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:28:19.983 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:19.983 04:29:07 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:28:19.983 00:28:19.983 real 0m22.262s 00:28:19.983 user 0m41.935s 00:28:19.983 sys 0m0.666s 00:28:19.983 04:29:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:19.983 04:29:07 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:28:19.983 ************************************ 00:28:19.983 END TEST bdev_fio 00:28:19.983 ************************************ 00:28:19.983 04:29:07 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:19.983 04:29:07 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:19.983 04:29:07 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:28:19.983 04:29:07 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:19.983 04:29:07 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:19.983 ************************************ 00:28:19.983 START TEST bdev_verify 00:28:19.983 ************************************ 00:28:19.983 04:29:07 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:28:19.983 [2024-05-15 04:29:07.722785] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
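For reference, the bdev_fio_trim stage above is driven entirely by a generated job file plus the SPDK fio plugin: the echo '[job_...]' / filename=... lines end up as per-bdev sections in test/bdev/bdev.fio, and fio is launched with the spdk_bdev ioengine preloaded. A minimal manual reproduction, using only paths and options already visible in the xtrace above, would be:

    # per-bdev job sections generated by blockdev.sh for the unmap-capable bdevs
    [job_crypto_ram]
    filename=crypto_ram
    [job_crypto_ram3]
    filename=crypto_ram3

    # run fio with the SPDK bdev ioengine plugin preloaded
    LD_PRELOAD=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev \
      /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
      /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio \
      --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json \
      --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output

This is a sketch of the harness's own invocation, not an independent recipe; the exact fio option set is whatever blockdev.sh passed above.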
00:28:19.983 [2024-05-15 04:29:07.722871] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3987114 ] 00:28:19.983 [2024-05-15 04:29:07.798907] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:19.983 [2024-05-15 04:29:07.916241] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:19.983 [2024-05-15 04:29:07.916245] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:20.241 [2024-05-15 04:29:08.113942] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:20.241 [2024-05-15 04:29:08.114039] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:20.241 [2024-05-15 04:29:08.114056] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:20.241 [2024-05-15 04:29:08.121968] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:20.241 [2024-05-15 04:29:08.121997] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:20.241 [2024-05-15 04:29:08.122011] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:20.241 [2024-05-15 04:29:08.129978] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:20.241 [2024-05-15 04:29:08.130011] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:20.241 [2024-05-15 04:29:08.130038] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:20.241 Running I/O for 5 seconds... 
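The verify run that follows is plain bdevperf pointed at the two software-crypto bdevs from bdev.json; nothing here is specific to the test harness. Re-running it by hand with the parameters recorded above (4 KiB I/O, queue depth 128, 5 s, core mask 0x3) would look like:

    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    ./build/examples/bdevperf --json test/bdev/bdev.json \
        -q 128 -o 4096 -w verify -t 5 -C -m 0x3

The -C flag, as used by the harness, lets each reactor core submit I/O to every bdev, which matches the result table below listing each crypto bdev twice, once per core mask (0x1 and 0x2).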
00:28:25.512 00:28:25.512 Latency(us) 00:28:25.512 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:25.512 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:25.512 Verification LBA range: start 0x0 length 0x800 00:28:25.512 crypto_ram : 5.02 8011.77 31.30 0.00 0.00 15928.09 2281.62 16117.00 00:28:25.512 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:25.512 Verification LBA range: start 0x800 length 0x800 00:28:25.512 crypto_ram : 5.02 8011.79 31.30 0.00 0.00 15927.59 2281.62 16214.09 00:28:25.512 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:28:25.512 Verification LBA range: start 0x0 length 0x800 00:28:25.512 crypto_ram3 : 5.02 4002.31 15.63 0.00 0.00 31845.84 10777.03 20874.43 00:28:25.512 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:28:25.512 Verification LBA range: start 0x800 length 0x800 00:28:25.512 crypto_ram3 : 5.02 4002.40 15.63 0.00 0.00 31843.31 10631.40 20777.34 00:28:25.512 =================================================================================================================== 00:28:25.512 Total : 24028.27 93.86 0.00 0.00 21233.42 2281.62 20874.43 00:28:25.512 00:28:25.512 real 0m5.820s 00:28:25.512 user 0m10.960s 00:28:25.512 sys 0m0.214s 00:28:25.512 04:29:13 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:25.512 04:29:13 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:28:25.512 ************************************ 00:28:25.512 END TEST bdev_verify 00:28:25.512 ************************************ 00:28:25.512 04:29:13 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:25.512 04:29:13 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:28:25.512 04:29:13 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:25.512 04:29:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:25.769 ************************************ 00:28:25.769 START TEST bdev_verify_big_io 00:28:25.769 ************************************ 00:28:25.769 04:29:13 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:28:25.769 [2024-05-15 04:29:13.591678] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:28:25.769 [2024-05-15 04:29:13.591757] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3987795 ] 00:28:25.769 [2024-05-15 04:29:13.673139] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:26.027 [2024-05-15 04:29:13.803093] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:26.027 [2024-05-15 04:29:13.803099] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:26.027 [2024-05-15 04:29:13.992058] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:26.027 [2024-05-15 04:29:13.992153] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:26.027 [2024-05-15 04:29:13.992174] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:26.027 [2024-05-15 04:29:14.000061] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:26.027 [2024-05-15 04:29:14.000096] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:26.027 [2024-05-15 04:29:14.000109] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:26.027 [2024-05-15 04:29:14.008081] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:26.027 [2024-05-15 04:29:14.008105] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:26.027 [2024-05-15 04:29:14.008133] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:26.316 Running I/O for 5 seconds... 
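Each bdevperf start in this section prints the same three pairs of notices: rpc_bdev_crypto_create finds its key ("Found key ...") but bdev_open_ext cannot yet find the base bdev (Malloc0, Malloc1, crypto_ram2), so vbdev_crypto defers creation until the base bdev arrives. That is the expected behavior when a bdev_crypto_create entry is processed before its base bdev exists: the module records the request and completes it from its examine path once the base registers. The config entries presumably look like the sketch below; the parameter names (base_bdev_name, name, key_name) are the ones echoed back in the driver_specific output earlier in the log, while the surrounding JSON-RPC framing is an assumption:

    { "method": "bdev_crypto_create",
      "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram", "key_name": "test_dek_sw" } }
    { "method": "bdev_crypto_create",
      "params": { "base_bdev_name": "crypto_ram2", "name": "crypto_ram3", "key_name": "test_dek_sw3" } }

Once the malloc bdevs register, the deferred vbdevs are created, which is why the I/O runs succeed despite the earlier "unable to find bdev" notices.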
00:28:31.607 00:28:31.607 Latency(us) 00:28:31.607 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:31.607 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:31.607 Verification LBA range: start 0x0 length 0x80 00:28:31.607 crypto_ram : 5.13 523.53 32.72 0.00 0.00 239178.36 7281.78 335544.32 00:28:31.607 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:31.607 Verification LBA range: start 0x80 length 0x80 00:28:31.607 crypto_ram : 5.10 527.51 32.97 0.00 0.00 237602.89 7039.05 330883.98 00:28:31.607 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:28:31.607 Verification LBA range: start 0x0 length 0x80 00:28:31.607 crypto_ram3 : 5.24 292.90 18.31 0.00 0.00 413473.00 6165.24 343311.55 00:28:31.607 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:28:31.607 Verification LBA range: start 0x80 length 0x80 00:28:31.607 crypto_ram3 : 5.21 294.74 18.42 0.00 0.00 411164.65 6165.24 341758.10 00:28:31.607 =================================================================================================================== 00:28:31.607 Total : 1638.69 102.42 0.00 0.00 301637.24 6165.24 343311.55 00:28:31.607 00:28:31.607 real 0m6.076s 00:28:31.607 user 0m11.452s 00:28:31.607 sys 0m0.225s 00:28:31.607 04:29:19 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:31.607 04:29:19 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:28:31.607 ************************************ 00:28:31.607 END TEST bdev_verify_big_io 00:28:31.607 ************************************ 00:28:31.866 04:29:19 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:31.866 04:29:19 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:28:31.866 04:29:19 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:31.866 04:29:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:31.866 ************************************ 00:28:31.866 START TEST bdev_write_zeroes 00:28:31.866 ************************************ 00:28:31.866 04:29:19 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:31.866 [2024-05-15 04:29:19.726376] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
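The bandwidth columns in the two bdevperf summaries above are simply IOPS × I/O size, which gives a quick sanity check of the tables:

    8011.77 IOPS * 4096 B  = 31.30 MiB/s   (crypto_ram,  verify,  4 KiB)
    4002.31 IOPS * 4096 B  = 15.63 MiB/s   (crypto_ram3, verify,  4 KiB)
     527.51 IOPS * 65536 B = 32.97 MiB/s   (crypto_ram,  big I/O, 64 KiB)
     294.74 IOPS * 65536 B = 18.42 MiB/s   (crypto_ram3, big I/O, 64 KiB)

Per-job throughput on the software-crypto path stays roughly flat (about 31-33 MiB/s for the single-layer crypto_ram, 16-18 MiB/s for crypto_ram3, which is stacked on crypto_ram2 and therefore encrypted twice) while IOPS scale down with the 16x larger I/O size.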
00:28:31.866 [2024-05-15 04:29:19.726436] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3988474 ] 00:28:31.866 [2024-05-15 04:29:19.807681] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:32.124 [2024-05-15 04:29:19.928155] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:32.124 [2024-05-15 04:29:20.107705] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:32.124 [2024-05-15 04:29:20.107801] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:32.124 [2024-05-15 04:29:20.107819] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:32.124 [2024-05-15 04:29:20.115720] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:28:32.124 [2024-05-15 04:29:20.115760] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:32.124 [2024-05-15 04:29:20.115773] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:32.124 [2024-05-15 04:29:20.123741] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:28:32.124 [2024-05-15 04:29:20.123765] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:28:32.124 [2024-05-15 04:29:20.123792] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:32.382 Running I/O for 1 seconds... 00:28:33.316 00:28:33.316 Latency(us) 00:28:33.316 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:33.316 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:33.316 crypto_ram : 1.01 30047.66 117.37 0.00 0.00 4249.96 2014.63 6747.78 00:28:33.316 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:28:33.316 crypto_ram3 : 1.01 15050.42 58.79 0.00 0.00 8443.44 3179.71 10145.94 00:28:33.316 =================================================================================================================== 00:28:33.316 Total : 45098.08 176.16 0.00 0.00 5651.72 2014.63 10145.94 00:28:33.575 00:28:33.575 real 0m1.789s 00:28:33.575 user 0m1.556s 00:28:33.575 sys 0m0.213s 00:28:33.575 04:29:21 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:33.575 04:29:21 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:28:33.575 ************************************ 00:28:33.575 END TEST bdev_write_zeroes 00:28:33.575 ************************************ 00:28:33.575 04:29:21 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:33.575 04:29:21 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:28:33.575 04:29:21 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:33.575 04:29:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:33.575 ************************************ 00:28:33.575 START TEST bdev_json_nonenclosed 00:28:33.575 
************************************ 00:28:33.575 04:29:21 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:33.575 [2024-05-15 04:29:21.565853] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:28:33.575 [2024-05-15 04:29:21.565924] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3988752 ] 00:28:33.832 [2024-05-15 04:29:21.646908] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:33.832 [2024-05-15 04:29:21.766542] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:33.832 [2024-05-15 04:29:21.766667] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:28:33.832 [2024-05-15 04:29:21.766695] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:33.832 [2024-05-15 04:29:21.766711] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:34.091 00:28:34.091 real 0m0.389s 00:28:34.091 user 0m0.275s 00:28:34.091 sys 0m0.112s 00:28:34.091 04:29:21 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:34.091 04:29:21 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:28:34.091 ************************************ 00:28:34.091 END TEST bdev_json_nonenclosed 00:28:34.091 ************************************ 00:28:34.091 04:29:21 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:34.091 04:29:21 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:28:34.091 04:29:21 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:34.091 04:29:21 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:34.091 ************************************ 00:28:34.091 START TEST bdev_json_nonarray 00:28:34.091 ************************************ 00:28:34.091 04:29:21 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:28:34.091 [2024-05-15 04:29:22.009322] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:28:34.091 [2024-05-15 04:29:22.009403] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3988775 ] 00:28:34.091 [2024-05-15 04:29:22.087315] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:34.349 [2024-05-15 04:29:22.205060] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:34.349 [2024-05-15 04:29:22.205177] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
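The two negative tests here (bdev_json_nonenclosed and bdev_json_nonarray) feed bdevperf deliberately malformed configs and only check that the app rejects them and exits non-zero, which is exactly what the *ERROR* and "spdk_app_stop'd on non-zero" lines show. For contrast, the shape the JSON config loader expects is a single object whose "subsystems" member is an array, roughly as in this minimal sketch (not the actual contents of nonenclosed.json or nonarray.json):

    {
      "subsystems": [
        {
          "subsystem": "bdev",
          "config": [ ]
        }
      ]
    }

nonenclosed.json presumably drops the enclosing braces and nonarray.json presumably makes "subsystems" a non-array value, matching the two error messages logged above and below.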
00:28:34.349 [2024-05-15 04:29:22.205202] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:28:34.349 [2024-05-15 04:29:22.205215] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:28:34.349 00:28:34.349 real 0m0.376s 00:28:34.349 user 0m0.264s 00:28:34.349 sys 0m0.110s 00:28:34.349 04:29:22 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:34.349 04:29:22 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:28:34.349 ************************************ 00:28:34.349 END TEST bdev_json_nonarray 00:28:34.349 ************************************ 00:28:34.349 04:29:22 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:28:34.349 04:29:22 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:28:34.349 04:29:22 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:28:34.349 04:29:22 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:28:34.349 04:29:22 blockdev_crypto_sw -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:28:34.349 04:29:22 blockdev_crypto_sw -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:34.349 04:29:22 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:34.606 ************************************ 00:28:34.606 START TEST bdev_crypto_enomem 00:28:34.607 ************************************ 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1121 -- # bdev_crypto_enomem 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=3988805 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 3988805 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@827 -- # '[' -z 3988805 ']' 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:34.607 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
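bdev_crypto_enomem builds a three-layer stack before injecting failures: a malloc bdev base0, an error-injection bdev on top of it (which SPDK registers as EE_base0), and a software-crypto bdev crypt0 on top of that, as the bdev_get_bdevs output further down confirms (base_bdev_name: EE_base0). A hedged sketch of the RPC sequence; the bdev_malloc_create and bdev_error_create parameter names are recalled from the SPDK RPC set rather than taken from this log, and the malloc size is inferred from the 2097152 x 512 B crypt0 shown below:

    { "method": "bdev_malloc_create", "params": { "name": "base0", "num_blocks": 2097152, "block_size": 512 } }
    { "method": "bdev_error_create",  "params": { "base_name": "base0" } }
    { "method": "bdev_crypto_create", "params": { "base_bdev_name": "EE_base0", "name": "crypt0", "key_name": "test_dek_sw" } }

The later "bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem" call then makes the error bdev fail incoming writes with ENOMEM, and the 5-second randwrite summary still reports 0.00 in the Fail/s column, i.e. the affected I/O is queued and retried rather than failed back to the application.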
00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:34.607 04:29:22 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:34.607 [2024-05-15 04:29:22.442275] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:28:34.607 [2024-05-15 04:29:22.442340] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3988805 ] 00:28:34.607 [2024-05-15 04:29:22.523360] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:34.864 [2024-05-15 04:29:22.639907] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:35.429 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:28:35.429 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@860 -- # return 0 00:28:35.429 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:28:35.429 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:35.429 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:35.429 true 00:28:35.686 base0 00:28:35.686 true 00:28:35.686 [2024-05-15 04:29:23.464419] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:28:35.686 crypt0 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@895 -- # local bdev_name=crypt0 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@896 -- # local bdev_timeout= 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local i 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # [[ -z '' ]] 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # bdev_timeout=2000 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # rpc_cmd bdev_wait_for_examine 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:35.686 [ 00:28:35.686 { 00:28:35.686 "name": "crypt0", 00:28:35.686 "aliases": [ 00:28:35.686 "5945a895-6b26-57d1-8d43-a11fc02fb34c" 00:28:35.686 ], 00:28:35.686 "product_name": "crypto", 00:28:35.686 "block_size": 512, 00:28:35.686 "num_blocks": 2097152, 00:28:35.686 "uuid": "5945a895-6b26-57d1-8d43-a11fc02fb34c", 00:28:35.686 "assigned_rate_limits": { 00:28:35.686 "rw_ios_per_sec": 0, 
00:28:35.686 "rw_mbytes_per_sec": 0, 00:28:35.686 "r_mbytes_per_sec": 0, 00:28:35.686 "w_mbytes_per_sec": 0 00:28:35.686 }, 00:28:35.686 "claimed": false, 00:28:35.686 "zoned": false, 00:28:35.686 "supported_io_types": { 00:28:35.686 "read": true, 00:28:35.686 "write": true, 00:28:35.686 "unmap": false, 00:28:35.686 "write_zeroes": true, 00:28:35.686 "flush": false, 00:28:35.686 "reset": true, 00:28:35.686 "compare": false, 00:28:35.686 "compare_and_write": false, 00:28:35.686 "abort": false, 00:28:35.686 "nvme_admin": false, 00:28:35.686 "nvme_io": false 00:28:35.686 }, 00:28:35.686 "memory_domains": [ 00:28:35.686 { 00:28:35.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:35.686 "dma_device_type": 2 00:28:35.686 } 00:28:35.686 ], 00:28:35.686 "driver_specific": { 00:28:35.686 "crypto": { 00:28:35.686 "base_bdev_name": "EE_base0", 00:28:35.686 "name": "crypt0", 00:28:35.686 "key_name": "test_dek_sw" 00:28:35.686 } 00:28:35.686 } 00:28:35.686 } 00:28:35.686 ] 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@903 -- # return 0 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=3988939 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:28:35.686 04:29:23 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:35.686 Running I/O for 5 seconds... 00:28:36.618 04:29:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:28:36.618 04:29:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:36.618 04:29:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:36.618 04:29:24 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:36.618 04:29:24 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 3988939 00:28:40.802 00:28:40.803 Latency(us) 00:28:40.803 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:40.803 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:28:40.803 crypt0 : 5.00 35731.84 139.58 0.00 0.00 891.37 412.63 1213.63 00:28:40.803 =================================================================================================================== 00:28:40.803 Total : 35731.84 139.58 0.00 0.00 891.37 412.63 1213.63 00:28:40.803 0 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 3988805 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@946 -- # '[' -z 3988805 ']' 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@950 -- # kill -0 3988805 00:28:40.803 04:29:28 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@951 -- # uname 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3988805 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3988805' 00:28:40.803 killing process with pid 3988805 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@965 -- # kill 3988805 00:28:40.803 Received shutdown signal, test time was about 5.000000 seconds 00:28:40.803 00:28:40.803 Latency(us) 00:28:40.803 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:40.803 =================================================================================================================== 00:28:40.803 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:40.803 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@970 -- # wait 3988805 00:28:41.061 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:28:41.061 00:28:41.061 real 0m6.533s 00:28:41.061 user 0m6.823s 00:28:41.061 sys 0m0.333s 00:28:41.061 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:41.061 04:29:28 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:28:41.061 ************************************ 00:28:41.061 END TEST bdev_crypto_enomem 00:28:41.061 ************************************ 00:28:41.061 04:29:28 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:28:41.061 04:29:28 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:28:41.061 04:29:28 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:28:41.061 04:29:28 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:41.062 04:29:28 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:28:41.062 04:29:28 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:28:41.062 04:29:28 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:28:41.062 04:29:28 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:28:41.062 00:28:41.062 real 0m53.530s 00:28:41.062 user 1m28.230s 00:28:41.062 sys 0m5.424s 00:28:41.062 04:29:28 blockdev_crypto_sw -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:41.062 04:29:28 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:28:41.062 ************************************ 00:28:41.062 END TEST blockdev_crypto_sw 00:28:41.062 ************************************ 00:28:41.062 04:29:28 -- spdk/autotest.sh@355 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:28:41.062 04:29:28 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:28:41.062 04:29:28 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:41.062 04:29:28 -- common/autotest_common.sh@10 -- # 
set +x 00:28:41.062 ************************************ 00:28:41.062 START TEST blockdev_crypto_qat 00:28:41.062 ************************************ 00:28:41.062 04:29:29 blockdev_crypto_qat -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:28:41.062 * Looking for test storage... 00:28:41.062 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=3989655 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:28:41.062 04:29:29 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 3989655 00:28:41.062 04:29:29 blockdev_crypto_qat -- common/autotest_common.sh@827 -- # '[' -z 3989655 ']' 00:28:41.062 04:29:29 blockdev_crypto_qat -- common/autotest_common.sh@831 
-- # local rpc_addr=/var/tmp/spdk.sock 00:28:41.062 04:29:29 blockdev_crypto_qat -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:41.062 04:29:29 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:41.062 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:41.062 04:29:29 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:41.062 04:29:29 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:41.320 [2024-05-15 04:29:29.126047] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:28:41.320 [2024-05-15 04:29:29.126139] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3989655 ] 00:28:41.320 [2024-05-15 04:29:29.219588] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:41.577 [2024-05-15 04:29:29.360653] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:42.142 04:29:30 blockdev_crypto_qat -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:28:42.142 04:29:30 blockdev_crypto_qat -- common/autotest_common.sh@860 -- # return 0 00:28:42.142 04:29:30 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:28:42.142 04:29:30 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:28:42.142 04:29:30 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:28:42.142 04:29:30 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:42.142 04:29:30 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:42.142 [2024-05-15 04:29:30.127148] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:28:42.142 [2024-05-15 04:29:30.135176] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:42.142 [2024-05-15 04:29:30.143192] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:42.400 [2024-05-15 04:29:30.222356] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:28:44.928 true 00:28:44.928 true 00:28:44.928 true 00:28:44.928 true 00:28:44.928 Malloc0 00:28:44.928 Malloc1 00:28:44.928 Malloc2 00:28:44.928 Malloc3 00:28:44.928 [2024-05-15 04:29:32.803612] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:28:44.928 crypto_ram 00:28:44.928 [2024-05-15 04:29:32.811617] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:28:44.928 crypto_ram1 00:28:44.928 [2024-05-15 04:29:32.819637] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:28:44.928 crypto_ram2 00:28:44.928 [2024-05-15 04:29:32.827660] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:28:44.928 crypto_ram3 00:28:44.928 [ 00:28:44.928 { 00:28:44.928 "name": "Malloc1", 00:28:44.928 "aliases": [ 00:28:44.928 "7f9b934a-cdc9-4a94-9f82-bebd67a73795" 00:28:44.928 ], 00:28:44.928 "product_name": "Malloc disk", 00:28:44.928 "block_size": 512, 00:28:44.928 "num_blocks": 65536, 00:28:44.928 "uuid": "7f9b934a-cdc9-4a94-9f82-bebd67a73795", 00:28:44.928 "assigned_rate_limits": { 00:28:44.928 
"rw_ios_per_sec": 0, 00:28:44.928 "rw_mbytes_per_sec": 0, 00:28:44.928 "r_mbytes_per_sec": 0, 00:28:44.928 "w_mbytes_per_sec": 0 00:28:44.928 }, 00:28:44.928 "claimed": true, 00:28:44.928 "claim_type": "exclusive_write", 00:28:44.928 "zoned": false, 00:28:44.928 "supported_io_types": { 00:28:44.928 "read": true, 00:28:44.928 "write": true, 00:28:44.928 "unmap": true, 00:28:44.928 "write_zeroes": true, 00:28:44.928 "flush": true, 00:28:44.928 "reset": true, 00:28:44.928 "compare": false, 00:28:44.928 "compare_and_write": false, 00:28:44.928 "abort": true, 00:28:44.928 "nvme_admin": false, 00:28:44.928 "nvme_io": false 00:28:44.928 }, 00:28:44.928 "memory_domains": [ 00:28:44.928 { 00:28:44.928 "dma_device_id": "system", 00:28:44.928 "dma_device_type": 1 00:28:44.928 }, 00:28:44.928 { 00:28:44.928 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:44.928 "dma_device_type": 2 00:28:44.928 } 00:28:44.928 ], 00:28:44.928 "driver_specific": {} 00:28:44.928 } 00:28:44.928 ] 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:44.928 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:44.928 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:28:44.928 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:44.928 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:44.928 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:44.928 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:44.928 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:28:44.929 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:28:44.929 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:28:44.929 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:44.929 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:44.929 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:45.187 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:28:45.187 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:28:45.187 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' 
"b041ee2b-5ed1-5097-8209-94a0fb84905b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b041ee2b-5ed1-5097-8209-94a0fb84905b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "6fe2fcad-1273-5fe2-86de-9c76b3516313"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6fe2fcad-1273-5fe2-86de-9c76b3516313",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "1f0cb1d3-fe19-5d15-8e47-c9607d40aa0c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1f0cb1d3-fe19-5d15-8e47-c9607d40aa0c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "cf785a2a-5204-5fd2-933b-519f170fc943"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "cf785a2a-5204-5fd2-933b-519f170fc943",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:28:45.187 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:28:45.187 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:28:45.187 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:28:45.187 04:29:32 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 3989655 00:28:45.187 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@946 -- # '[' -z 3989655 ']' 00:28:45.187 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@950 -- # kill -0 3989655 00:28:45.187 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@951 -- # uname 00:28:45.187 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:28:45.187 04:29:32 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3989655 00:28:45.187 04:29:33 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:28:45.187 04:29:33 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:28:45.187 04:29:33 blockdev_crypto_qat -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3989655' 00:28:45.187 killing process with pid 3989655 00:28:45.187 04:29:33 blockdev_crypto_qat -- common/autotest_common.sh@965 -- # kill 3989655 00:28:45.187 04:29:33 blockdev_crypto_qat -- common/autotest_common.sh@970 -- # wait 3989655 00:28:45.753 04:29:33 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:28:45.753 04:29:33 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:45.753 04:29:33 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:28:45.753 04:29:33 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:45.753 04:29:33 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:45.753 ************************************ 00:28:45.753 START TEST bdev_hello_world 00:28:45.753 ************************************ 00:28:45.753 04:29:33 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:28:46.011 [2024-05-15 04:29:33.800482] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:28:46.011 [2024-05-15 04:29:33.800539] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3990201 ] 00:28:46.011 [2024-05-15 04:29:33.895382] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:46.269 [2024-05-15 04:29:34.036015] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:46.269 [2024-05-15 04:29:34.057300] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:28:46.269 [2024-05-15 04:29:34.065326] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:46.269 [2024-05-15 04:29:34.073353] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:46.269 [2024-05-15 04:29:34.193166] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:28:48.798 [2024-05-15 04:29:36.611864] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:28:48.798 [2024-05-15 04:29:36.611962] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:48.798 [2024-05-15 04:29:36.611983] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:48.798 [2024-05-15 04:29:36.619876] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:28:48.798 [2024-05-15 04:29:36.619904] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:48.798 [2024-05-15 04:29:36.619920] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:48.798 [2024-05-15 04:29:36.627915] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:28:48.798 [2024-05-15 04:29:36.627944] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:48.798 [2024-05-15 04:29:36.627960] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:48.798 [2024-05-15 04:29:36.635913] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:28:48.798 [2024-05-15 04:29:36.635940] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:48.798 [2024-05-15 04:29:36.635954] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:48.798 [2024-05-15 04:29:36.721605] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:28:48.798 [2024-05-15 04:29:36.721671] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:28:48.798 [2024-05-15 04:29:36.721693] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:28:48.798 [2024-05-15 04:29:36.722974] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:28:48.798 [2024-05-15 04:29:36.723064] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:28:48.798 [2024-05-15 04:29:36.723089] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:28:48.798 [2024-05-15 04:29:36.723157] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
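The hello_world sequence above is the stock hello_bdev example pointed at the QAT-backed crypto_ram bdev; the NOTICE lines trace its full life cycle (open the bdev, get an I/O channel, write, read the string back, stop). Re-running it by hand uses only arguments already shown in the log:

    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    ./build/examples/hello_bdev --json test/bdev/bdev.json -b crypto_ram

The roughly 3.4 s real time reported just below is almost entirely startup cost (QAT probing and deferred crypto vbdev creation) rather than the single write/read itself.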
00:28:48.798 00:28:48.798 [2024-05-15 04:29:36.723195] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:28:49.364 00:28:49.364 real 0m3.447s 00:28:49.364 user 0m2.869s 00:28:49.364 sys 0m0.534s 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:28:49.364 ************************************ 00:28:49.364 END TEST bdev_hello_world 00:28:49.364 ************************************ 00:28:49.364 04:29:37 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:28:49.364 04:29:37 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:28:49.364 04:29:37 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:49.364 04:29:37 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:49.364 ************************************ 00:28:49.364 START TEST bdev_bounds 00:28:49.364 ************************************ 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1121 -- # bdev_bounds '' 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=3990626 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 3990626' 00:28:49.364 Process bdevio pid: 3990626 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 3990626 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@827 -- # '[' -z 3990626 ']' 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:49.364 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:49.364 04:29:37 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:49.364 [2024-05-15 04:29:37.306278] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:28:49.364 [2024-05-15 04:29:37.306345] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3990626 ] 00:28:49.622 [2024-05-15 04:29:37.390784] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:49.622 [2024-05-15 04:29:37.514014] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:28:49.622 [2024-05-15 04:29:37.514070] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:28:49.622 [2024-05-15 04:29:37.514074] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:49.622 [2024-05-15 04:29:37.535742] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:28:49.622 [2024-05-15 04:29:37.543764] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:49.622 [2024-05-15 04:29:37.551782] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:49.879 [2024-05-15 04:29:37.670121] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:28:52.405 [2024-05-15 04:29:40.003537] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:28:52.405 [2024-05-15 04:29:40.003655] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:52.405 [2024-05-15 04:29:40.003681] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:52.405 [2024-05-15 04:29:40.011551] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:28:52.405 [2024-05-15 04:29:40.011609] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:52.405 [2024-05-15 04:29:40.011631] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:52.405 [2024-05-15 04:29:40.019574] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:28:52.405 [2024-05-15 04:29:40.019622] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:52.405 [2024-05-15 04:29:40.019645] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:52.405 [2024-05-15 04:29:40.027594] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:28:52.405 [2024-05-15 04:29:40.027642] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:52.405 [2024-05-15 04:29:40.027663] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:52.405 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:28:52.405 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@860 -- # return 0 00:28:52.405 04:29:40 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:28:52.405 I/O targets: 00:28:52.405 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:28:52.405 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:28:52.405 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:28:52.405 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:28:52.405 00:28:52.405 00:28:52.405 
CUnit - A unit testing framework for C - Version 2.1-3 00:28:52.405 http://cunit.sourceforge.net/ 00:28:52.405 00:28:52.405 00:28:52.405 Suite: bdevio tests on: crypto_ram3 00:28:52.405 Test: blockdev write read block ...passed 00:28:52.405 Test: blockdev write zeroes read block ...passed 00:28:52.405 Test: blockdev write zeroes read no split ...passed 00:28:52.405 Test: blockdev write zeroes read split ...passed 00:28:52.405 Test: blockdev write zeroes read split partial ...passed 00:28:52.405 Test: blockdev reset ...passed 00:28:52.405 Test: blockdev write read 8 blocks ...passed 00:28:52.405 Test: blockdev write read size > 128k ...passed 00:28:52.405 Test: blockdev write read invalid size ...passed 00:28:52.405 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:52.405 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:52.405 Test: blockdev write read max offset ...passed 00:28:52.405 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:52.405 Test: blockdev writev readv 8 blocks ...passed 00:28:52.405 Test: blockdev writev readv 30 x 1block ...passed 00:28:52.405 Test: blockdev writev readv block ...passed 00:28:52.405 Test: blockdev writev readv size > 128k ...passed 00:28:52.405 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:52.405 Test: blockdev comparev and writev ...passed 00:28:52.405 Test: blockdev nvme passthru rw ...passed 00:28:52.405 Test: blockdev nvme passthru vendor specific ...passed 00:28:52.405 Test: blockdev nvme admin passthru ...passed 00:28:52.405 Test: blockdev copy ...passed 00:28:52.405 Suite: bdevio tests on: crypto_ram2 00:28:52.405 Test: blockdev write read block ...passed 00:28:52.405 Test: blockdev write zeroes read block ...passed 00:28:52.405 Test: blockdev write zeroes read no split ...passed 00:28:52.405 Test: blockdev write zeroes read split ...passed 00:28:52.405 Test: blockdev write zeroes read split partial ...passed 00:28:52.405 Test: blockdev reset ...passed 00:28:52.405 Test: blockdev write read 8 blocks ...passed 00:28:52.405 Test: blockdev write read size > 128k ...passed 00:28:52.405 Test: blockdev write read invalid size ...passed 00:28:52.405 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:52.405 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:52.405 Test: blockdev write read max offset ...passed 00:28:52.405 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:52.405 Test: blockdev writev readv 8 blocks ...passed 00:28:52.405 Test: blockdev writev readv 30 x 1block ...passed 00:28:52.405 Test: blockdev writev readv block ...passed 00:28:52.405 Test: blockdev writev readv size > 128k ...passed 00:28:52.405 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:52.405 Test: blockdev comparev and writev ...passed 00:28:52.405 Test: blockdev nvme passthru rw ...passed 00:28:52.405 Test: blockdev nvme passthru vendor specific ...passed 00:28:52.405 Test: blockdev nvme admin passthru ...passed 00:28:52.405 Test: blockdev copy ...passed 00:28:52.405 Suite: bdevio tests on: crypto_ram1 00:28:52.405 Test: blockdev write read block ...passed 00:28:52.405 Test: blockdev write zeroes read block ...passed 00:28:52.405 Test: blockdev write zeroes read no split ...passed 00:28:52.405 Test: blockdev write zeroes read split ...passed 00:28:52.664 Test: blockdev write zeroes read split partial ...passed 00:28:52.664 Test: blockdev reset ...passed 00:28:52.664 
Test: blockdev write read 8 blocks ...passed 00:28:52.664 Test: blockdev write read size > 128k ...passed 00:28:52.664 Test: blockdev write read invalid size ...passed 00:28:52.664 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:52.664 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:52.664 Test: blockdev write read max offset ...passed 00:28:52.664 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:52.664 Test: blockdev writev readv 8 blocks ...passed 00:28:52.664 Test: blockdev writev readv 30 x 1block ...passed 00:28:52.664 Test: blockdev writev readv block ...passed 00:28:52.664 Test: blockdev writev readv size > 128k ...passed 00:28:52.664 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:52.664 Test: blockdev comparev and writev ...passed 00:28:52.664 Test: blockdev nvme passthru rw ...passed 00:28:52.664 Test: blockdev nvme passthru vendor specific ...passed 00:28:52.664 Test: blockdev nvme admin passthru ...passed 00:28:52.664 Test: blockdev copy ...passed 00:28:52.664 Suite: bdevio tests on: crypto_ram 00:28:52.664 Test: blockdev write read block ...passed 00:28:52.664 Test: blockdev write zeroes read block ...passed 00:28:52.664 Test: blockdev write zeroes read no split ...passed 00:28:52.664 Test: blockdev write zeroes read split ...passed 00:28:52.664 Test: blockdev write zeroes read split partial ...passed 00:28:52.664 Test: blockdev reset ...passed 00:28:52.664 Test: blockdev write read 8 blocks ...passed 00:28:52.664 Test: blockdev write read size > 128k ...passed 00:28:52.664 Test: blockdev write read invalid size ...passed 00:28:52.664 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:28:52.664 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:28:52.664 Test: blockdev write read max offset ...passed 00:28:52.664 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:28:52.664 Test: blockdev writev readv 8 blocks ...passed 00:28:52.664 Test: blockdev writev readv 30 x 1block ...passed 00:28:52.664 Test: blockdev writev readv block ...passed 00:28:52.664 Test: blockdev writev readv size > 128k ...passed 00:28:52.664 Test: blockdev writev readv size > 128k in two iovs ...passed 00:28:52.664 Test: blockdev comparev and writev ...passed 00:28:52.664 Test: blockdev nvme passthru rw ...passed 00:28:52.664 Test: blockdev nvme passthru vendor specific ...passed 00:28:52.664 Test: blockdev nvme admin passthru ...passed 00:28:52.664 Test: blockdev copy ...passed 00:28:52.664 00:28:52.664 Run Summary: Type Total Ran Passed Failed Inactive 00:28:52.664 suites 4 4 n/a 0 0 00:28:52.664 tests 92 92 92 0 0 00:28:52.664 asserts 520 520 520 0 n/a 00:28:52.664 00:28:52.664 Elapsed time = 0.617 seconds 00:28:52.664 0 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 3990626 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@946 -- # '[' -z 3990626 ']' 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@950 -- # kill -0 3990626 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@951 -- # uname 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3990626 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- 
common/autotest_common.sh@952 -- # process_name=reactor_0 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3990626' 00:28:52.664 killing process with pid 3990626 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@965 -- # kill 3990626 00:28:52.664 04:29:40 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@970 -- # wait 3990626 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:28:53.230 00:28:53.230 real 0m3.846s 00:28:53.230 user 0m10.583s 00:28:53.230 sys 0m0.642s 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1122 -- # xtrace_disable 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:28:53.230 ************************************ 00:28:53.230 END TEST bdev_bounds 00:28:53.230 ************************************ 00:28:53.230 04:29:41 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:28:53.230 04:29:41 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:28:53.230 04:29:41 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:28:53.230 04:29:41 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:28:53.230 ************************************ 00:28:53.230 START TEST bdev_nbd 00:28:53.230 ************************************ 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1121 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- 
bdev/blockdev.sh@314 -- # local nbd_list 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=3991160 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 3991160 /var/tmp/spdk-nbd.sock 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@827 -- # '[' -z 3991160 ']' 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@832 -- # local max_retries=100 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:28:53.230 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # xtrace_disable 00:28:53.230 04:29:41 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:28:53.230 [2024-05-15 04:29:41.213793] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:28:53.230 [2024-05-15 04:29:41.213884] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:53.488 [2024-05-15 04:29:41.297607] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:53.488 [2024-05-15 04:29:41.413983] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:28:53.488 [2024-05-15 04:29:41.435267] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:28:53.488 [2024-05-15 04:29:41.443289] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:28:53.488 [2024-05-15 04:29:41.451305] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:28:53.747 [2024-05-15 04:29:41.565455] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:28:56.312 [2024-05-15 04:29:43.948271] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:28:56.312 [2024-05-15 04:29:43.948357] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:28:56.312 [2024-05-15 04:29:43.948376] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:56.312 [2024-05-15 04:29:43.956289] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:28:56.312 [2024-05-15 04:29:43.956317] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:28:56.312 [2024-05-15 04:29:43.956331] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:56.312 [2024-05-15 04:29:43.964309] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:28:56.312 [2024-05-15 04:29:43.964336] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:28:56.312 [2024-05-15 04:29:43.964350] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:56.312 [2024-05-15 04:29:43.972329] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:28:56.312 [2024-05-15 04:29:43.972355] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:28:56.312 [2024-05-15 04:29:43.972369] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:28:56.312 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:28:56.312 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@860 -- # return 0 00:28:56.312 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:28:56.312 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 
crypto_ram3' 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:56.313 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:56.596 1+0 records in 00:28:56.596 1+0 records out 00:28:56.596 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233583 s, 17.5 MB/s 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:56.596 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:28:56.854 04:29:44 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:56.854 1+0 records in 00:28:56.854 1+0 records out 00:28:56.854 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232694 s, 17.6 MB/s 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:56.854 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd2 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd2 /proc/partitions 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@881 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:57.112 1+0 records in 00:28:57.112 1+0 records out 00:28:57.112 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252114 s, 16.2 MB/s 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:57.112 04:29:44 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:57.112 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:57.112 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:57.112 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:57.112 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:57.112 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd3 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd3 /proc/partitions 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:57.369 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:57.370 1+0 records in 00:28:57.370 1+0 records out 00:28:57.370 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228322 s, 17.9 MB/s 00:28:57.370 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:57.370 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:57.370 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:57.370 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:57.370 04:29:45 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:57.370 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:28:57.370 04:29:45 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:28:57.370 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:57.627 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:28:57.627 { 00:28:57.627 "nbd_device": "/dev/nbd0", 00:28:57.627 "bdev_name": "crypto_ram" 00:28:57.627 }, 00:28:57.627 { 00:28:57.627 "nbd_device": "/dev/nbd1", 00:28:57.627 "bdev_name": "crypto_ram1" 00:28:57.627 }, 00:28:57.627 { 00:28:57.627 "nbd_device": "/dev/nbd2", 00:28:57.627 "bdev_name": "crypto_ram2" 00:28:57.627 }, 00:28:57.627 { 00:28:57.627 "nbd_device": "/dev/nbd3", 00:28:57.627 "bdev_name": "crypto_ram3" 00:28:57.627 } 00:28:57.627 ]' 00:28:57.627 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:28:57.627 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:28:57.627 { 00:28:57.627 "nbd_device": "/dev/nbd0", 00:28:57.627 "bdev_name": "crypto_ram" 00:28:57.627 }, 00:28:57.627 { 00:28:57.627 "nbd_device": "/dev/nbd1", 00:28:57.627 "bdev_name": "crypto_ram1" 00:28:57.627 }, 00:28:57.627 { 00:28:57.627 "nbd_device": "/dev/nbd2", 00:28:57.627 "bdev_name": "crypto_ram2" 00:28:57.627 }, 00:28:57.627 { 00:28:57.627 "nbd_device": "/dev/nbd3", 00:28:57.627 "bdev_name": "crypto_ram3" 00:28:57.627 } 00:28:57.627 ]' 00:28:57.627 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:28:57.627 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:28:57.627 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:57.628 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:28:57.628 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:57.628 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:28:57.628 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:57.628 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:28:57.885 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:57.885 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:57.885 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:57.885 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:57.885 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:57.885 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:57.885 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:57.885 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:57.885 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:57.885 04:29:45 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:28:58.143 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:58.143 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:58.143 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:58.143 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:58.143 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:58.143 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:58.143 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:58.143 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:58.143 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:58.143 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:28:58.402 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:28:58.402 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:28:58.402 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:28:58.402 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:58.402 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:58.402 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:28:58.402 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:58.402 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:58.402 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:58.402 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:58.659 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:28:58.917 04:29:46 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:58.917 04:29:46 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:28:59.174 /dev/nbd0 00:28:59.174 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:59.174 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:59.174 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:28:59.174 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 
00:28:59.174 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:59.174 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:59.174 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:28:59.174 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:59.174 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:59.174 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:59.175 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:59.175 1+0 records in 00:28:59.175 1+0 records out 00:28:59.175 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225813 s, 18.1 MB/s 00:28:59.175 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:59.175 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:59.175 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:59.175 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:59.175 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:59.175 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:59.175 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:59.175 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:28:59.432 /dev/nbd1 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:59.433 1+0 records in 00:28:59.433 1+0 records out 00:28:59.433 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000203464 s, 20.1 MB/s 00:28:59.433 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:59.690 
04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:59.690 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:59.690 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:59.690 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:59.690 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:59.690 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:59.690 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:28:59.690 /dev/nbd10 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # local nbd_name=nbd10 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd10 /proc/partitions 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:59.948 1+0 records in 00:28:59.948 1+0 records out 00:28:59.948 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254197 s, 16.1 MB/s 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:28:59.948 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:28:59.948 /dev/nbd11 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@864 -- # local nbd_name=nbd11 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@865 -- # local i 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # grep -q -w nbd11 /proc/partitions 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # break 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@881 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:00.206 1+0 records in 00:29:00.206 1+0 records out 00:29:00.206 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230233 s, 17.8 MB/s 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # size=4096 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # return 0 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:00.206 04:29:47 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:29:00.465 { 00:29:00.465 "nbd_device": "/dev/nbd0", 00:29:00.465 "bdev_name": "crypto_ram" 00:29:00.465 }, 00:29:00.465 { 00:29:00.465 "nbd_device": "/dev/nbd1", 00:29:00.465 "bdev_name": "crypto_ram1" 00:29:00.465 }, 00:29:00.465 { 00:29:00.465 "nbd_device": "/dev/nbd10", 00:29:00.465 "bdev_name": "crypto_ram2" 00:29:00.465 }, 00:29:00.465 { 00:29:00.465 "nbd_device": "/dev/nbd11", 00:29:00.465 "bdev_name": "crypto_ram3" 00:29:00.465 } 00:29:00.465 ]' 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:29:00.465 { 00:29:00.465 "nbd_device": "/dev/nbd0", 00:29:00.465 "bdev_name": "crypto_ram" 00:29:00.465 }, 00:29:00.465 { 00:29:00.465 "nbd_device": "/dev/nbd1", 00:29:00.465 "bdev_name": "crypto_ram1" 00:29:00.465 }, 00:29:00.465 { 00:29:00.465 "nbd_device": "/dev/nbd10", 00:29:00.465 "bdev_name": "crypto_ram2" 00:29:00.465 }, 00:29:00.465 { 00:29:00.465 "nbd_device": "/dev/nbd11", 00:29:00.465 "bdev_name": "crypto_ram3" 00:29:00.465 } 00:29:00.465 ]' 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] 
| .nbd_device' 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:29:00.465 /dev/nbd1 00:29:00.465 /dev/nbd10 00:29:00.465 /dev/nbd11' 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:29:00.465 /dev/nbd1 00:29:00.465 /dev/nbd10 00:29:00.465 /dev/nbd11' 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:29:00.465 256+0 records in 00:29:00.465 256+0 records out 00:29:00.465 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00500523 s, 209 MB/s 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:29:00.465 256+0 records in 00:29:00.465 256+0 records out 00:29:00.465 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0700929 s, 15.0 MB/s 00:29:00.465 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:00.466 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:29:00.466 256+0 records in 00:29:00.466 256+0 records out 00:29:00.466 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0574456 s, 18.3 MB/s 00:29:00.466 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:00.466 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:29:00.466 256+0 records in 00:29:00.466 256+0 records out 00:29:00.466 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0499385 s, 21.0 MB/s 00:29:00.466 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:29:00.466 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 
oflag=direct 00:29:00.722 256+0 records in 00:29:00.722 256+0 records out 00:29:00.722 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0474245 s, 22.1 MB/s 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:29:00.722 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:29:00.723 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:29:00.723 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:00.723 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:00.723 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:00.723 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:00.723 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:00.723 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:00.723 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:00.980 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:00.980 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:00.980 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:00.980 04:29:48 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:00.980 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:00.980 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:00.980 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:00.980 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:00.980 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:00.980 04:29:48 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:29:01.237 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:01.237 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:01.237 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:01.237 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:01.237 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:01.237 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:01.237 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:01.237 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:01.237 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:01.237 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:29:01.495 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:29:01.495 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:29:01.495 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:29:01.495 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:01.495 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:01.495 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:29:01.495 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:01.495 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:01.495 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:01.495 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:29:01.753 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:29:01.753 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:29:01.753 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:29:01.753 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:01.753 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:01.753 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:29:01.753 
04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:01.753 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:01.753 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:29:01.753 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:01.753 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:29:02.010 04:29:49 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:29:02.267 malloc_lvol_verify 00:29:02.267 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:29:02.525 39591569-74c1-45df-bd49-56c62316ef96 00:29:02.525 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:29:02.782 1858f275-85c0-4b03-af5e-1fb0ab6df3e3 00:29:02.782 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:29:03.040 /dev/nbd0 00:29:03.040 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:29:03.040 mke2fs 1.46.5 (30-Dec-2021) 00:29:03.040 Discarding device blocks: 0/4096 done 00:29:03.040 
Creating filesystem with 4096 1k blocks and 1024 inodes 00:29:03.040 00:29:03.040 Allocating group tables: 0/1 done 00:29:03.040 Writing inode tables: 0/1 done 00:29:03.040 Creating journal (1024 blocks): done 00:29:03.040 Writing superblocks and filesystem accounting information: 0/1 done 00:29:03.040 00:29:03.040 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:29:03.040 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:29:03.040 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:29:03.040 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:03.040 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:03.040 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:29:03.040 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:03.040 04:29:50 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 3991160 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@946 -- # '[' -z 3991160 ']' 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@950 -- # kill -0 3991160 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@951 -- # uname 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 3991160 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@964 -- # echo 'killing process with pid 3991160' 00:29:03.298 killing process with pid 3991160 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@965 -- # kill 3991160 00:29:03.298 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@970 -- # wait 3991160 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT 
SIGTERM EXIT 00:29:03.864 00:29:03.864 real 0m10.443s 00:29:03.864 user 0m13.902s 00:29:03.864 sys 0m3.658s 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:29:03.864 ************************************ 00:29:03.864 END TEST bdev_nbd 00:29:03.864 ************************************ 00:29:03.864 04:29:51 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:29:03.864 04:29:51 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:29:03.864 04:29:51 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:29:03.864 04:29:51 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:29:03.864 04:29:51 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:29:03.864 04:29:51 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:03.864 04:29:51 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:03.864 ************************************ 00:29:03.864 START TEST bdev_fio 00:29:03.864 ************************************ 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1121 -- # fio_test_suite '' 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:03.864 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=verify 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type=AIO 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z verify ']' 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1297 -- 
# cat 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1309 -- # '[' verify == verify ']' 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1310 -- # cat 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1319 -- # '[' AIO == AIO ']' 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1320 -- # /usr/src/fio/fio --version 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1320 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1321 -- # echo serialize_overlap=1 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:29:03.864 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:03.865 ************************************ 00:29:03.865 START TEST bdev_fio_rw_verify 00:29:03.865 ************************************ 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1335 -- # local sanitizers 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # shift 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local asan_lib= 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libasan 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:03.865 04:29:51 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:04.123 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:04.123 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:04.123 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:04.123 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:04.123 fio-3.35 00:29:04.123 Starting 4 threads 00:29:18.989 00:29:18.989 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3993011: Wed May 15 04:30:04 2024 00:29:18.989 read: IOPS=25.0k, BW=97.5MiB/s (102MB/s)(975MiB/10001msec) 00:29:18.989 slat (usec): min=14, max=511, avg=54.17, stdev=26.20 00:29:18.989 clat (usec): min=27, max=1223, avg=303.20, stdev=176.58 00:29:18.989 lat (usec): min=55, max=1349, avg=357.37, stdev=188.24 00:29:18.989 clat percentiles (usec): 00:29:18.989 | 50.000th=[ 258], 99.000th=[ 816], 99.900th=[ 996], 99.990th=[ 1123], 00:29:18.989 | 99.999th=[ 1221] 00:29:18.989 write: IOPS=27.4k, BW=107MiB/s (112MB/s)(1045MiB/9750msec); 0 zone resets 00:29:18.989 slat (usec): min=23, max=447, avg=65.44, stdev=26.07 00:29:18.989 clat (usec): min=20, max=1850, avg=343.32, stdev=189.12 00:29:18.989 lat (usec): min=68, max=1907, avg=408.76, stdev=199.93 00:29:18.989 clat percentiles (usec): 00:29:18.989 | 50.000th=[ 306], 99.000th=[ 889], 99.900th=[ 1074], 99.990th=[ 1221], 00:29:18.989 | 99.999th=[ 1598] 00:29:18.989 bw ( KiB/s): min=92608, max=133592, per=97.57%, avg=107065.26, stdev=2709.29, samples=76 00:29:18.989 iops : min=23152, max=33398, avg=26766.32, stdev=677.32, samples=76 00:29:18.989 lat (usec) : 50=0.02%, 100=5.98%, 250=36.46%, 500=40.13%, 750=14.55% 00:29:18.989 lat (usec) : 1000=2.68% 00:29:18.989 lat (msec) : 2=0.18% 00:29:18.989 cpu : usr=99.44%, sys=0.01%, ctx=77, majf=0, minf=366 00:29:18.989 IO depths : 1=4.4%, 2=27.3%, 4=54.6%, 8=13.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:18.989 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:18.989 complete : 0=0.0%, 4=88.0%, 8=12.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:18.989 issued rwts: total=249699,267480,0,0 short=0,0,0,0 dropped=0,0,0,0 00:29:18.989 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:18.989 00:29:18.989 Run status group 0 (all jobs): 00:29:18.989 READ: bw=97.5MiB/s (102MB/s), 97.5MiB/s-97.5MiB/s (102MB/s-102MB/s), io=975MiB (1023MB), run=10001-10001msec 00:29:18.989 WRITE: bw=107MiB/s (112MB/s), 107MiB/s-107MiB/s (112MB/s-112MB/s), io=1045MiB (1096MB), run=9750-9750msec 00:29:18.989 00:29:18.989 real 0m13.571s 00:29:18.989 user 0m43.041s 00:29:18.989 sys 0m0.564s 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:29:18.989 ************************************ 00:29:18.989 END TEST bdev_fio_rw_verify 00:29:18.989 ************************************ 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1276 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1277 -- # local workload=trim 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1278 -- # local bdev_type= 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local env_context= 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local fio_dir=/usr/src/fio 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1287 -- # '[' -z trim ']' 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -n '' ']' 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1297 -- # cat 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1309 -- # '[' trim == verify ']' 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # '[' trim == trim ']' 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo rw=trimwrite 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b041ee2b-5ed1-5097-8209-94a0fb84905b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b041ee2b-5ed1-5097-8209-94a0fb84905b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "6fe2fcad-1273-5fe2-86de-9c76b3516313"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6fe2fcad-1273-5fe2-86de-9c76b3516313",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "1f0cb1d3-fe19-5d15-8e47-c9607d40aa0c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1f0cb1d3-fe19-5d15-8e47-c9607d40aa0c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "cf785a2a-5204-5fd2-933b-519f170fc943"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "cf785a2a-5204-5fd2-933b-519f170fc943",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:29:18.989 crypto_ram1 00:29:18.989 crypto_ram2 00:29:18.989 crypto_ram3 ]] 00:29:18.989 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b041ee2b-5ed1-5097-8209-94a0fb84905b"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b041ee2b-5ed1-5097-8209-94a0fb84905b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "6fe2fcad-1273-5fe2-86de-9c76b3516313"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "6fe2fcad-1273-5fe2-86de-9c76b3516313",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "1f0cb1d3-fe19-5d15-8e47-c9607d40aa0c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "1f0cb1d3-fe19-5d15-8e47-c9607d40aa0c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "cf785a2a-5204-5fd2-933b-519f170fc943"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "cf785a2a-5204-5fd2-933b-519f170fc943",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:18.990 ************************************ 00:29:18.990 START TEST bdev_fio_trim 00:29:18.990 ************************************ 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1121 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1333 -- # local fio_dir=/usr/src/fio 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1335 -- # local sanitizers 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # shift 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local asan_lib= 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libasan 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # for sanitizer in "${sanitizers[@]}" 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # grep libclang_rt.asan 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # awk '{print $3}' 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # asan_lib= 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # [[ -n '' ]] 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:29:18.990 04:30:05 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1348 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:29:18.990 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:18.990 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:18.990 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:18.990 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:29:18.990 fio-3.35 00:29:18.990 Starting 4 threads 00:29:31.183 00:29:31.183 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=3995165: Wed May 15 04:30:18 2024 00:29:31.183 write: IOPS=47.9k, BW=187MiB/s (196MB/s)(1872MiB/10001msec); 0 zone resets 00:29:31.183 slat (usec): min=16, max=1466, avg=49.79, stdev=30.12 00:29:31.183 clat (usec): min=16, max=1803, avg=175.29, stdev=104.11 00:29:31.183 lat (usec): min=51, max=1853, avg=225.08, stdev=121.07 00:29:31.183 clat percentiles (usec): 00:29:31.183 | 50.000th=[ 155], 99.000th=[ 529], 99.900th=[ 627], 99.990th=[ 685], 00:29:31.183 | 99.999th=[ 906] 00:29:31.183 bw ( KiB/s): min=172544, max=235040, per=100.00%, avg=191960.11, stdev=5136.96, samples=76 00:29:31.183 iops : min=43136, max=58760, avg=47990.00, stdev=1284.23, samples=76 00:29:31.183 trim: IOPS=47.9k, BW=187MiB/s (196MB/s)(1872MiB/10001msec); 0 zone resets 00:29:31.183 slat (usec): min=5, max=128, avg=12.63, stdev= 5.35 00:29:31.183 clat (usec): min=5, max=1854, avg=215.86, stdev=125.43 00:29:31.183 lat (usec): min=13, max=1862, avg=228.50, stdev=127.06 00:29:31.183 clat percentiles (usec): 00:29:31.183 | 50.000th=[ 186], 99.000th=[ 627], 99.900th=[ 734], 99.990th=[ 799], 00:29:31.183 | 99.999th=[ 1287] 00:29:31.183 bw ( KiB/s): min=172544, max=235040, per=100.00%, avg=191960.53, stdev=5136.88, samples=76 00:29:31.183 iops : min=43136, max=58760, avg=47990.11, stdev=1284.21, samples=76 00:29:31.183 lat (usec) : 10=0.01%, 20=0.01%, 50=2.62%, 100=16.75%, 250=57.54% 00:29:31.183 lat (usec) : 500=20.11%, 750=2.94%, 1000=0.03% 00:29:31.183 lat (msec) : 2=0.01% 00:29:31.183 cpu : usr=99.46%, sys=0.00%, ctx=68, majf=0, minf=120 00:29:31.183 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:29:31.183 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:31.183 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:29:31.183 issued rwts: total=0,479113,479115,0 short=0,0,0,0 dropped=0,0,0,0 00:29:31.183 latency : target=0, window=0, percentile=100.00%, depth=8 00:29:31.183 00:29:31.183 Run status group 0 (all jobs): 00:29:31.183 WRITE: bw=187MiB/s (196MB/s), 187MiB/s-187MiB/s (196MB/s-196MB/s), io=1872MiB (1962MB), run=10001-10001msec 00:29:31.183 TRIM: bw=187MiB/s (196MB/s), 187MiB/s-187MiB/s (196MB/s-196MB/s), io=1872MiB (1962MB), run=10001-10001msec 00:29:31.183 00:29:31.183 real 0m13.535s 00:29:31.183 user 0m43.029s 00:29:31.183 sys 0m0.549s 00:29:31.183 04:30:18 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:31.183 04:30:18 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:29:31.183 ************************************ 00:29:31.183 END TEST bdev_fio_trim 00:29:31.183 ************************************ 00:29:31.183 04:30:19 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:29:31.183 04:30:19 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:29:31.183 04:30:19 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:29:31.183 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:29:31.183 04:30:19 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:29:31.183 00:29:31.183 real 0m27.363s 00:29:31.183 
user 1m26.213s 00:29:31.183 sys 0m1.233s 00:29:31.183 04:30:19 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:31.183 04:30:19 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:29:31.183 ************************************ 00:29:31.183 END TEST bdev_fio 00:29:31.183 ************************************ 00:29:31.183 04:30:19 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:29:31.183 04:30:19 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:31.183 04:30:19 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:29:31.183 04:30:19 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:31.183 04:30:19 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:31.183 ************************************ 00:29:31.183 START TEST bdev_verify 00:29:31.183 ************************************ 00:29:31.183 04:30:19 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:29:31.183 [2024-05-15 04:30:19.128790] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:29:31.183 [2024-05-15 04:30:19.128875] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3996601 ] 00:29:31.441 [2024-05-15 04:30:19.210957] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:31.441 [2024-05-15 04:30:19.331137] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:29:31.441 [2024-05-15 04:30:19.331143] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:31.441 [2024-05-15 04:30:19.352466] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:31.441 [2024-05-15 04:30:19.360495] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:31.441 [2024-05-15 04:30:19.368510] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:31.698 [2024-05-15 04:30:19.484313] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:34.225 [2024-05-15 04:30:21.856718] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:34.225 [2024-05-15 04:30:21.856817] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:34.225 [2024-05-15 04:30:21.856848] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.225 [2024-05-15 04:30:21.864734] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:34.225 [2024-05-15 04:30:21.864763] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:34.225 [2024-05-15 04:30:21.864778] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.225 
[2024-05-15 04:30:21.872755] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:34.225 [2024-05-15 04:30:21.872783] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:34.225 [2024-05-15 04:30:21.872798] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.225 [2024-05-15 04:30:21.880776] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:34.225 [2024-05-15 04:30:21.880803] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:34.225 [2024-05-15 04:30:21.880818] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:34.225 Running I/O for 5 seconds... 00:29:39.485 00:29:39.485 Latency(us) 00:29:39.485 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:39.485 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:39.485 Verification LBA range: start 0x0 length 0x1000 00:29:39.485 crypto_ram : 5.06 556.26 2.17 0.00 0.00 229773.85 4757.43 148354.09 00:29:39.485 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:39.485 Verification LBA range: start 0x1000 length 0x1000 00:29:39.485 crypto_ram : 5.07 555.88 2.17 0.00 0.00 229928.46 3762.25 149130.81 00:29:39.485 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:39.485 Verification LBA range: start 0x0 length 0x1000 00:29:39.485 crypto_ram1 : 5.06 556.14 2.17 0.00 0.00 229307.86 5170.06 139033.41 00:29:39.485 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:39.485 Verification LBA range: start 0x1000 length 0x1000 00:29:39.485 crypto_ram1 : 5.07 555.77 2.17 0.00 0.00 229483.00 4150.61 139810.13 00:29:39.485 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:39.485 Verification LBA range: start 0x0 length 0x1000 00:29:39.485 crypto_ram2 : 5.05 4334.31 16.93 0.00 0.00 29343.29 5801.15 25049.32 00:29:39.485 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:39.485 Verification LBA range: start 0x1000 length 0x1000 00:29:39.485 crypto_ram2 : 5.04 4300.68 16.80 0.00 0.00 29548.08 3883.61 25243.50 00:29:39.485 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:29:39.485 Verification LBA range: start 0x0 length 0x1000 00:29:39.485 crypto_ram3 : 5.05 4332.96 16.93 0.00 0.00 29286.90 5558.42 24758.04 00:29:39.485 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:29:39.485 Verification LBA range: start 0x1000 length 0x1000 00:29:39.485 crypto_ram3 : 5.05 4307.82 16.83 0.00 0.00 29459.46 3422.44 25243.50 00:29:39.485 =================================================================================================================== 00:29:39.485 Total : 19499.81 76.17 0.00 0.00 52305.55 3422.44 149130.81 00:29:39.742 00:29:39.742 real 0m8.443s 00:29:39.742 user 0m15.885s 00:29:39.742 sys 0m0.504s 00:29:39.742 04:30:27 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:39.742 04:30:27 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:29:39.742 ************************************ 00:29:39.742 END TEST bdev_verify 00:29:39.742 ************************************ 00:29:39.742 04:30:27 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test 
bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:39.742 04:30:27 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 16 -le 1 ']' 00:29:39.742 04:30:27 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:39.742 04:30:27 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:39.742 ************************************ 00:29:39.742 START TEST bdev_verify_big_io 00:29:39.742 ************************************ 00:29:39.743 04:30:27 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:29:39.743 [2024-05-15 04:30:27.624959] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:29:39.743 [2024-05-15 04:30:27.625029] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3997562 ] 00:29:39.743 [2024-05-15 04:30:27.704577] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:40.001 [2024-05-15 04:30:27.828995] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:29:40.001 [2024-05-15 04:30:27.829002] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:40.001 [2024-05-15 04:30:27.850329] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:29:40.001 [2024-05-15 04:30:27.858360] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:29:40.001 [2024-05-15 04:30:27.866379] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:29:40.001 [2024-05-15 04:30:27.977256] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:29:42.611 [2024-05-15 04:30:30.363748] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:29:42.611 [2024-05-15 04:30:30.363852] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:29:42.611 [2024-05-15 04:30:30.363891] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.611 [2024-05-15 04:30:30.371765] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:29:42.611 [2024-05-15 04:30:30.371802] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:29:42.611 [2024-05-15 04:30:30.371818] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.611 [2024-05-15 04:30:30.379786] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:29:42.611 [2024-05-15 04:30:30.379814] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:29:42.611 [2024-05-15 04:30:30.379847] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.611 [2024-05-15 04:30:30.387807] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:29:42.611 
[2024-05-15 04:30:30.387843] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:29:42.611 [2024-05-15 04:30:30.387860] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:29:42.611 Running I/O for 5 seconds... 00:29:43.550 [2024-05-15 04:30:31.196990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.197402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.197743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.198075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.198158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.198231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.198284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.198337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.198716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.198742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.198759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.198774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.201695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.201756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.201809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.201871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.202282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.202339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.202397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.202462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.202801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.202832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.550 [2024-05-15 04:30:31.202852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.202867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.205650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.205709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.205764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.205837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.206203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.206260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.206318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.206379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.206712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.206736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.206752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.206768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.209482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.209545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.209598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.209651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.210031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.210095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.210164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.210217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.210607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.210631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.550 [2024-05-15 04:30:31.210649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.210666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.213351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.213415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.213468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.213521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.213939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.214003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.214047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.214091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.214494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.214520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.214535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.214565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.217175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.217233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.217286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.217338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.217757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.217813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.217875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.217935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.218275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.218299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.550 [2024-05-15 04:30:31.218315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.218330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.220925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.220989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.221034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.221079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.221504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.550 [2024-05-15 04:30:31.221567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.221619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.221670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.222057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.222078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.222108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.222121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.224854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.224918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.224978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.225022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.225442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.225500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.225552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.225603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.226003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.226025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.551 [2024-05-15 04:30:31.226054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.226067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.228638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.228719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.228772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.228832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.229236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.229293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.229345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.229401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.229747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.229772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.229788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.229803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.232434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.232493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.232546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.232598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.232985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.233049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.233101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.233170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.233528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.233551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.551 [2024-05-15 04:30:31.233568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.233589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.236169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.236244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.236303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.236355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.236734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.236790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.236850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.236912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.237303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.237329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.237346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.237362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.239962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.240031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.240077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.240122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.240528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.240585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.240639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.240691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.241075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.241098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.551 [2024-05-15 04:30:31.241132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.241146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.243580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.243638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.243691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.243744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.244176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.244239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.244293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.551 [2024-05-15 04:30:31.244345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.244710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.244733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.244750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.244765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.247326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.247385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.247437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.247490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.247882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.247945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.247989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.248033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.248405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.248430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.552 [2024-05-15 04:30:31.248448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.248466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.250919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.250988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.251033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.251078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.251515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.251572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.251624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.251675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.252047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.252067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.252095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.252109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.254628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.254687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.254740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.254792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.255214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.255278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.255330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.255389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.255751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.255774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.552 [2024-05-15 04:30:31.255790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.255805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.258199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.258257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.258310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.258371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.258765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.258836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.258903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.258955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.259294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.259319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.259335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.259351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.261762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.261821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.261895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.261950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.262327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.262388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.262446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.262498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.262894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.262916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.552 [2024-05-15 04:30:31.262946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.262960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.265305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.265363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.265416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.265475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.265925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.265981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.266033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.266085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.266461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.266485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.266503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.266518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.269192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.269252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.269304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.269357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.269772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.269835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.269901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.269947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.270338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.270364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.552 [2024-05-15 04:30:31.270382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.270400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.272804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.272890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.272937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.552 [2024-05-15 04:30:31.272982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.273392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.273448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.273503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.273555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.273938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.273961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.273977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.273990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.276292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.276351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.276405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.276459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.276896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.276958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.277004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.277053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.277382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.277407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.553 [2024-05-15 04:30:31.277423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.277438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.279804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.279871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.279933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.279978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.280329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.280385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.280437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.280500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.280851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.280892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.280906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.280919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.283274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.283337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.283391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.283443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.283802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.283866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.283927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.283977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.284389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.284414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.553 [2024-05-15 04:30:31.284431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.284447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.286811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.286883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.286943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.286988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.287380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.287437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.287489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.287540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.287935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.287957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.287986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.288000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.290379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.290438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.290498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.290551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.290996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.291059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.291108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.291170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.291566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.291590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.553 [2024-05-15 04:30:31.291608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.291624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.293930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.293988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.294041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.294099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.294553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.294609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.294661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.294713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.295092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.295113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.295126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.295139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.296947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.553 [2024-05-15 04:30:31.297010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.297055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.297102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.297414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.297472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.297528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.297580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.297849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.297879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.554 [2024-05-15 04:30:31.297897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.297911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.300104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.300175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.300228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.300281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.300617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.300671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.300723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.300777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.301039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.301059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.301072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.301085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.302908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.302958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.303002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.303046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.303357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.303416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.303471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.303523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.303930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.303968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.554 [2024-05-15 04:30:31.303982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.303995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.306139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.306210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.306266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.306324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.306631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.306688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.306740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.306793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.307045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.307067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.307081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.307093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.308814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.308883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.308947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.308986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.309370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.309429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.309481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.309534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.309930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.554 [2024-05-15 04:30:31.309952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.822 [2024-05-15 04:30:31.625377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.625747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.625811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.625894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.625940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.625998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.626366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.626391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.626407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.626422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.628327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.628386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.628440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.628493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.628841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.628921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.628968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.629013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.629058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.629397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.629422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.629438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.629453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.631725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.822 [2024-05-15 04:30:31.631785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.631868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.631937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.632321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.632399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.632458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.632511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.632566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.632969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.822 [2024-05-15 04:30:31.632991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.633006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.633019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.634956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.635026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.635079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.635152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.635483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.635552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.635605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.635658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.635712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.636097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.636144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.636161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.823 [2024-05-15 04:30:31.636177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.638164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.638232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.638285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.638337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.638719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.638783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.638852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.638924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.638970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.639349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.639375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.639392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.639408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.641305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.641364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.641417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.641470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.641834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.641907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.641969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.642014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.642059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.642441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.823 [2024-05-15 04:30:31.642465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.642482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.642497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.644524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.644607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.644665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.644719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.645121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.645188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.645243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.645310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.645363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.645786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.645809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.645833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.645851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.647916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.647982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.648028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.648075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.648392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.648464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.648518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.648573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.823 [2024-05-15 04:30:31.648632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.649013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.649035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.649063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.649077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.651071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.651148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.651213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.651278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.651596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.651671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.651727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.651781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.651842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.652258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.652283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.652300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.652316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.654230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.654298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.654352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.654405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.654789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.654860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.823 [2024-05-15 04:30:31.654923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.823 [2024-05-15 04:30:31.654968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.655014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.655368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.655394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.655410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.655426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.657253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.657316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.657365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.657410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.657770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.657843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.657912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.657959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.658004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.658381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.658411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.658429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.658445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.660059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.660123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.660192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.660245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.824 [2024-05-15 04:30:31.660646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.660714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.660767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.660820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.660905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.661251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.661275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.661291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.661306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.662864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.662937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.663964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.665604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.665667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.824 [2024-05-15 04:30:31.665721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.665773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.666103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.666174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.666239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.666292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.666345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.666728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.666752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.666769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.666785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.668240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.668300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.668357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.668411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.668678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.668744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.668801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.668862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.668928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.669177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.669201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.669218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.669233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.824 [2024-05-15 04:30:31.670987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.671051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.671097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.671157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.671544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.671614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.671673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.671730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.671784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.672062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.672084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.672098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.672110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.673593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.673655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.673710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.673763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.674023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.674081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.674128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.674188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.674243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.824 [2024-05-15 04:30:31.674509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.674533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.825 [2024-05-15 04:30:31.674549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.674564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.676575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.676633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.676686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.676739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.677012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.677081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.677136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.677192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.677245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.677513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.677542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.677559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.677574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.679076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.679144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.679207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.679270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.679537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.679604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.679659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.679711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.679768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.825 [2024-05-15 04:30:31.680089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.680111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.680139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.680151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.682987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.683001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.683013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.684525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.684583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.684641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.684692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.685007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.685078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.685139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.825 [2024-05-15 04:30:31.685203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.685256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.685623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.685648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.685665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.685682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.687377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.687434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.687486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.687544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.687808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.687881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.687936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.687989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.825 [2024-05-15 04:30:31.688041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.688369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.688392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.688408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.688423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.689852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.689920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.689983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.690031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.690449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.826 [2024-05-15 04:30:31.690513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.690572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.690625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.690682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.691059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.691080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.691109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.691122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.692717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.692774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.692838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.692907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.693149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.693218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.693273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.693328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.693384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.693649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.693673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.693690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.693704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.695262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.695321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.695374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.826 [2024-05-15 04:30:31.695425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.695804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.695894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.695942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.695986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.696029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.696385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.696410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.696432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.696449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.697899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.697962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.698539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.698597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.698911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.698968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.699014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.699058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.699102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.699379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.699403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.699418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.699433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:43.826 [2024-05-15 04:30:31.701224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:43.826 [2024-05-15 04:30:31.701282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[identical message from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated several hundred times, timestamps 04:30:31.701334 through 04:30:31.997694, console offsets 00:29:43.826 through 00:29:44.095]
00:29:44.095 [2024-05-15 04:30:31.997718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:44.095 [2024-05-15 04:30:31.997741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.095 [2024-05-15 04:30:31.997757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.095 [2024-05-15 04:30:31.999647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.095 [2024-05-15 04:30:31.999704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.095 [2024-05-15 04:30:31.999763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.095 [2024-05-15 04:30:31.999817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.095 [2024-05-15 04:30:32.000151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.000221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.000290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.000349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.000403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.000739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.000763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.000779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.000795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.002655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.002713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.002774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.002835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.003083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.003167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.003224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.003277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.003331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.096 [2024-05-15 04:30:32.003597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.003621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.003637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.003652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.005179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.005236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.005294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.005347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.005748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.005819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.005897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.005942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.005988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.006350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.006376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.006393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.006410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.008127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.008186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.008247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.008302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.008571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.008637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.008692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.096 [2024-05-15 04:30:32.008745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.008799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.009119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.009162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.009178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.009193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.010635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.010694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.010753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.010812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.011203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.011269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.011324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.011381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.011436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.011775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.011798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.011815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.011839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.013429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.013486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.013548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.013604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.013880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.096 [2024-05-15 04:30:32.013953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.014000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.014046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.014093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.014370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.014395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.014411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.014426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.015987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.016052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.016114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.016168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.096 [2024-05-15 04:30:32.016538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.016605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.016659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.016711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.016763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.017132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.017153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.017170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.017202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.018621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.018679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.018735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.097 [2024-05-15 04:30:32.018789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.019093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.019180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.019235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.019292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.019357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.019626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.019650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.019666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.019682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.021348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.021407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.021459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.021518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.021976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.022048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.022096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.022156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.022222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.022523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.022547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.022563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.022578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.024115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.097 [2024-05-15 04:30:32.024177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.024235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.024292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.024615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.024681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.024735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.024787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.024852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.025117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.025138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.025152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.025181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.027016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.027082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.027143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.027207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.027589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.027659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.027714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.027766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.027818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.028106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.028145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.028161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.097 [2024-05-15 04:30:32.028176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.029675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.029735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.029787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.029847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.030119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.030194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.030251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.030311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.030371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.030646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.030672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.030689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.030704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.032574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.032634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.032687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.032744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.033016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.033083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.033145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.033201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.033254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.033520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.097 [2024-05-15 04:30:32.033545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.033561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.033577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.035061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.035144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.097 [2024-05-15 04:30:32.035204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.035251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.035491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.035551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.035601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.035649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.035701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.036035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.036058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.036072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.036089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.037859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.037924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.037989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.038035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.038287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.038353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.038405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.038454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.098 [2024-05-15 04:30:32.038501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.038743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.038765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.038779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.038793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.040285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.040345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.040397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.040448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.040772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.040860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.040931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.040983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.041030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.041420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.041445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.041462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.041478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.043286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.043345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.043410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.043457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.043734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.043816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.098 [2024-05-15 04:30:32.043904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.043955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.044002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.044295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.044318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.044349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.044364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.045830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.045913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.045966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.046015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.046388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.046454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.046519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.046566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.046614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.046978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.047000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.047015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.047029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.048703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.048761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.048834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.048908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.098 [2024-05-15 04:30:32.049162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.049227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.049301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.049368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.049441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.049732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.049757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.049773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.049803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.051348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.051406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.051471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.051520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.051931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.051991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.052038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.052093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.052161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.052566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.052591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.052608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.052637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.054147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.054225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.098 [2024-05-15 04:30:32.054289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.054337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.054679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.054760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.098 [2024-05-15 04:30:32.054810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.054883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.054931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.055213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.055238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.055254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.055284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.056941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.057008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.057060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.057108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.057444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.057515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.057579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.057646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.057699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.058071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.058093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.058106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.058134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.099 [2024-05-15 04:30:32.059592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.059674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.059730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.059782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.060102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.060183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.060254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.060308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.060362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.060631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.060655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.060671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.060686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.062507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.062566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.062618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.062670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.063074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.063146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.063214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.063267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.063323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.063593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.099 [2024-05-15 04:30:32.063618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.099 [2024-05-15 04:30:32.063634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.099 [... identical *ERROR* line from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated continuously; intermediate entries with timestamps 04:30:32.063649 through 04:30:32.347812 omitted for brevity ...] 
00:29:44.367 [2024-05-15 04:30:32.347941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.367 [2024-05-15 04:30:32.347987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.348255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.348281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.348298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.348315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.350108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.350185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.350240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.350294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.350675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.350744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.350800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.350864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.350931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.351185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.351212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.351230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.351247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.352786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.352859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.352927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.352975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.353241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.353309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.367 [2024-05-15 04:30:32.353364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.353418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.353472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.353742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.353767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.353784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.353800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.355752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.355813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.355893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.355941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.356203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.356272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.356331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.356386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.356448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.356719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.356746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.356764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.356781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.358336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.358402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.358456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.358510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.367 [2024-05-15 04:30:32.358783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.358862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.358930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.358976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.359024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.359351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.359378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.359395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.359412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.361290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.361350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.361404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.361458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.361742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.361811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.361878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.361947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.361992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.362254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.362280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.362298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.362315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.363896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.363949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.367 [2024-05-15 04:30:32.363995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.364041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.364374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.364444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.364507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.364562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.364616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.365004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.365029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.365045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.365060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.366790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.366859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.366928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.366981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.367249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.367318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.367376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.367 [2024-05-15 04:30:32.367431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.367485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.367858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.367899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.367914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.367929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.368 [2024-05-15 04:30:32.369415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.369475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.369530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.369594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.370063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.370173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.370270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.370347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.370410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.370762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.370788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.370806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.370831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.372535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.372607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.372665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.372719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.373033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.373096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.373164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.373237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.373291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.373586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.373613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.368 [2024-05-15 04:30:32.373631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.368 [2024-05-15 04:30:32.373648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.375395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.375458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.375512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.375567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.375923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.375986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.376034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.376080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.376147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.376543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.376571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.376589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.376607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.378028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.378082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.378151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.378207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.378481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.378547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.378606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.378670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.378724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.630 [2024-05-15 04:30:32.379000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.379024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.379039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.379053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.380959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.381015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.381061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.381107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.381490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.381581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.381638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.381692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.381746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.382019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.382043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.382059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.382074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.383575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.383640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.383695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.383753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.384024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.384086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.384137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.630 [2024-05-15 04:30:32.384204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.384259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.384529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.384555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.384573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.384590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.386666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.386728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.386783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.630 [2024-05-15 04:30:32.386846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.387124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.387202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.387265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.387319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.387376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.387647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.387674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.387691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.387708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.389182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.389242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.389305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.389361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.389634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.631 [2024-05-15 04:30:32.389709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.389766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.389821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.389899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.390217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.390244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.390262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.390278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.392249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.392309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.392363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.392417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.392702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.392774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.392839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.392908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.392954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.393199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.393238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.393255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.393272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.394753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.394817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.394895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.631 [2024-05-15 04:30:32.394943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.395251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.395321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.395377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.395437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.395495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.395917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.395947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.395964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.395979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.397783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.397850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.397916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.397963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.398225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.398299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.398360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.398413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.398467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.398775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.398802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.398820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.398848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.400259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.631 [2024-05-15 04:30:32.400319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.400378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.400432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.400811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.400901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.400951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.400997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.401043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.401401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.401428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.401445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.401461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.403155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.403220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.403275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.403333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.403605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.403678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.403735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.403789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.403866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.404115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.404142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.404159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.631 [2024-05-15 04:30:32.404177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.405695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.405754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.405808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.405873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.406249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.406316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.631 [2024-05-15 04:30:32.406377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.406437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.406491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.406894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.406919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.406936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.406952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.408477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.408536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.408590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.408647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.408977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.409037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.409090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.409157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.409215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.409486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.632 [2024-05-15 04:30:32.409513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.409531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.409548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.411204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.411264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.411322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.411382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.411775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.411853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.411919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.411966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.412014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.412342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.412368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.412385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.412401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.413997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.414057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.414111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.414165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.414477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.414549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.414606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.632 [2024-05-15 04:30:32.414659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.632 [2024-05-15 04:30:32.414726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:44.899 [... identical "accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!" entries repeat several hundred times between 04:30:32.414726 and 04:30:32.712080 ...]
00:29:44.899 [2024-05-15 04:30:32.712080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:44.899 [2024-05-15 04:30:32.712160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.712216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.712276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.712338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.712746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.712772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.712789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.712807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.714417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.714478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.714532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.714588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.714937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.715005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.715055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.715104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.715168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.715457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.715487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.715505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.715521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.717296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.717356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.717410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.900 [2024-05-15 04:30:32.717466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.717893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.717956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.718010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.718059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.718124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.718463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.718489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.718505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.718521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.720167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.720226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.720280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.720332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.720625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.720693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.720748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.720803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.720885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.721148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.721175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.721193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.721209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.723208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.900 [2024-05-15 04:30:32.723268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.723332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.723386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.723748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.723818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.723898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.723949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.724000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.724322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.724348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.724364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.724380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.725988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.726041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.726090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.726154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.726436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.726513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.726573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.726628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.726684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.727087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.727125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.727140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.900 [2024-05-15 04:30:32.727153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.729281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.729341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.729395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.729448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.729754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.729821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.729907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.729958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.730013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.730295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.730322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.730340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.730357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.731955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.732014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.732064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.732126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.732435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.732504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.732559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.732620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.732676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.733083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.900 [2024-05-15 04:30:32.733125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.900 [2024-05-15 04:30:32.733143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.733158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.735043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.735097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.735163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.735229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.735498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.735567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.735629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.735686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.735740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.736054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.736079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.736098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.736128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.737692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.737753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.737807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.737892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.738329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.738400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.738456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.738512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.901 [2024-05-15 04:30:32.738567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.738922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.738946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.738962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.738976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.740674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.740733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.740793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.740855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.741135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.741215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.741273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.741331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.741387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.741683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.741708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.741725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.741741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.743453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.743513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.743572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.743627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.743977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.744050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.901 [2024-05-15 04:30:32.744097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.744149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.744220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.744610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.744635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.744653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.744672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.746172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.746237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.746299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.746359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.746627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.746697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.746756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.746811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.746900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.747145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.747181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.747199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.747215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.751094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.751176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.751238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.751293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.901 [2024-05-15 04:30:32.751584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.751655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.751711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.751775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.751844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.752102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.752146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.752164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.752181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.756325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.756385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.756440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.756494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.756897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.756955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.757004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.757058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.757110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.757528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.757557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.757576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.757593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.761161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.901 [2024-05-15 04:30:32.761224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.902 [2024-05-15 04:30:32.761288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.761343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.761613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.761683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.761739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.761794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.761861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.762106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.762146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.762164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.762186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.764621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.764682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.764740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.764794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.765051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.765128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.765190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.765246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.765299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.765598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.765625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.765644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.765661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.902 [2024-05-15 04:30:32.770454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.770515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.770569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.770629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.771064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.771127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.771202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.771268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.771324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.771620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.771646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.771663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.771678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.774812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.774894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.774942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.774993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.775269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.775338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.775399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.775454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.775508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.775889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.775911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.902 [2024-05-15 04:30:32.775926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.775939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.779613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.779677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.779731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.779785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.780085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.780174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.780233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.780287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.780341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.780644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.780670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.780687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.780703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.784331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.784403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.784471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.784524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.784864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.784931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.784978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.785026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.785077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.902 [2024-05-15 04:30:32.785348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.785374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.785392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.785408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.789606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.789671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.789726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.789783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.790154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.790226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.790282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.790337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.790391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.790756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.790782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.790799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.790814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.794212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.794277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.794333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.794386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.902 [2024-05-15 04:30:32.794677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.794745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.794801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.903 [2024-05-15 04:30:32.794864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.794936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.795180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.795218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.795235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.795257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.799104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.799180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.799234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.799290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.799561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.799626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.799686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.799740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.799793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.800062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.800085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.800099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.800112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.803954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.804006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.804056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.804102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:44.903 [2024-05-15 04:30:32.804500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:44.903 [2024-05-15 04:30:32.804568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:44.903 [2024-05-15 04:30:32.804624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same "Failed to get src_mbufs!" error repeats at sub-millisecond intervals ...]
00:29:45.168 [2024-05-15 04:30:32.961597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:45.168 [2024-05-15 04:30:32.984116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:29:45.168 [2024-05-15 04:30:32.984251] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[... the same "Failed to get dst_mbufs!" error repeats at sub-millisecond intervals ...]
00:29:45.172 [2024-05-15 04:30:33.127535] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:29:45.172 [2024-05-15 04:30:33.129666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.130028] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.130394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.130761] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.131453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.131815] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.132174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.132539] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.132911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.132941] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.135135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.135517] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.135898] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.136244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.136963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.137322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.137690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.138061] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.138457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.138491] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.140659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.141024] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.141385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.141749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.172 [2024-05-15 04:30:33.142460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.142836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.143189] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.143557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.143977] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.144009] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.146208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.146574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.147007] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.148198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.148900] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.149264] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.149629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.149995] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.150363] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.150396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.152230] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.152596] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.152967] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.153317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.154813] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.156198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.157698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.158359] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.172 [2024-05-15 04:30:33.158669] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.158702] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.160564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.160940] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.161290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.162885] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.164546] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.165934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.166367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.167602] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.167902] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.167936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.172 [2024-05-15 04:30:33.169913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.173 [2024-05-15 04:30:33.170259] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.173 [2024-05-15 04:30:33.171645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.173 [2024-05-15 04:30:33.172904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.173 [2024-05-15 04:30:33.174546] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.173 [2024-05-15 04:30:33.175101] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.173 [2024-05-15 04:30:33.176547] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.173 [2024-05-15 04:30:33.178009] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.173 [2024-05-15 04:30:33.178325] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.173 [2024-05-15 04:30:33.178375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.180601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.182040] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.435 [2024-05-15 04:30:33.183331] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.184746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.185532] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.186857] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.188260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.188331] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.188617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.188649] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.190901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.190964] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.192642] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.194143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.195841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.196377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.197484] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.198882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.199141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.199173] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.201356] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.202748] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.203784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.205159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.206286] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.207898] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.435 [2024-05-15 04:30:33.209322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.210890] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.211158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.211192] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.213762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.214891] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.216246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.217627] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.218512] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.218798] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.220002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.221396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.222828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.223180] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.223248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.223646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.223682] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.223709] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.226785] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.435 [2024-05-15 04:30:33.226874] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.228237] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.228305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.228741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.230029] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.436 [2024-05-15 04:30:33.230088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.231493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.231562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.231853] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.231901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.231923] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.234007] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.234068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.234138] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.234204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.234491] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.235626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.235694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.237061] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.237141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.237423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.237455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.237482] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.238963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.239018] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.239071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.239125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.239492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.239565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.436 [2024-05-15 04:30:33.239631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.239696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.239762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.240122] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.240171] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.240195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.241860] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.241934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.241985] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.242036] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.242307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.242385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.242453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.242515] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.242579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.242867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.242911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.242932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.244429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.244496] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.244562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.244626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.245017] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.245083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.436 [2024-05-15 04:30:33.245148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.245222] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.245285] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.245679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.245715] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.245742] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.247162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.247233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.247299] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.247362] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.247642] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.247716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.247793] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.247869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.247940] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.248196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.248229] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.248253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.250004] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.250066] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.250117] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.250195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.250576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.250652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.436 [2024-05-15 04:30:33.250719] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.250784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.250856] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.251114] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.436 [2024-05-15 04:30:33.251162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.251187] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.252664] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.252731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.252796] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.252869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.253116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.253207] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.253272] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.253334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.253396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.253674] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.253707] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.253732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.255633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.255708] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.255779] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.255850] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.256093] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.256184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.437 [2024-05-15 04:30:33.256252] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.256319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.256380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.256630] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.256657] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.256677] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.258058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.258113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.258178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.258254] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.258532] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.258609] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.258675] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.258741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.258810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.259170] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.259204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.259232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.261044] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.261101] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.261177] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.261240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.261517] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.261593] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.437 [2024-05-15 04:30:33.261656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.261724] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.261794] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.262058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.262087] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.262109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.263572] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.263638] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.263710] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.263780] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.264139] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.264217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.264283] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.264346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.264410] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.264787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.264821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.264873] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.266478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.266545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.266611] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.266672] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.266961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.267027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.437 [2024-05-15 04:30:33.267085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.267141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.267216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.267486] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.267514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.267543] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.269059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.269115] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.269201] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.269266] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.437 [2024-05-15 04:30:33.269661] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.269745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.269809] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.269892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.269945] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.270323] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.270360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.270389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.271821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.271909] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.271962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.272016] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.272290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.272366] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.438 [2024-05-15 04:30:33.272434] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.272503] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.272566] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.272850] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.272897] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.272918] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.274666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.274738] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.274801] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.274871] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.275233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.275310] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.275377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.275440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.275505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.275801] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.275840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.275867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.277304] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.277371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.277440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.277505] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.277783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:29:45.438 [2024-05-15 04:30:33.277877] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:29:45.438 [2024-05-15 04:30:33.277931] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:29:45.438 [... the same "Failed to get dst_mbufs!" error from accel_dpdk_cryptodev.c:476 repeats continuously between 04:30:33.277931 and 04:30:33.438602; duplicate entries omitted ...]
00:29:45.443 [2024-05-15 04:30:33.438602] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:29:45.701 [2024-05-15 04:30:33.484458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:45.701 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats continuously between 04:30:33.484458 and 04:30:33.612040; duplicate entries omitted ...]
00:29:45.701 [2024-05-15 04:30:33.612040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:45.701 [2024-05-15 04:30:33.612396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.612752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.613087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.613499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.613526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.613544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.616224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.617514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.617889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.618208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.618904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.619246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.619599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.619959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.620330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.620360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.620378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.622372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.623861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.624200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.701 [2024-05-15 04:30:33.624555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.625218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.625573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.625933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.702 [2024-05-15 04:30:33.626261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.626646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.626673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.626691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.629674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.630024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.630377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.630735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.631445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.631800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.632140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.632498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.632831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.632877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.632894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.635050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.635409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.635769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.636104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.636844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.637170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.637523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.637895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.638189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.702 [2024-05-15 04:30:33.638216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.638234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.642971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.643315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.643669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.644018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.644735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.645074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.645440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.646556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.646951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.646974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.646988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.648796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.648885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.648926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.648980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.649325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.649741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.649815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.649892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.649933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.649978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.650316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.702 [2024-05-15 04:30:33.650721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.650749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.650767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.650784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.656128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.656495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.656789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.657091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.657382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.657723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.658063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.658422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.658779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.659184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.659225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.659243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.659259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.662464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.662819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.663159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.663529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.663909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.664247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.664600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.702 [2024-05-15 04:30:33.664961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.665301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.665689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.665717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.665734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.665751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.668805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.669173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.669529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.669900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.670236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.670597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.670961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.671303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.671735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.672007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.672032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.672047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.672062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.674216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.674579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.674947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.675281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.675628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.702 [2024-05-15 04:30:33.675990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.676336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.676693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.677211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.677489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.677517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.677535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.677552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.680173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.680533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.680905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.681230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.681599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.681971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.682363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.683575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.683941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.684266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.684293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.702 [2024-05-15 04:30:33.684310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.684326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.686408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.686762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.687090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.703 [2024-05-15 04:30:33.687450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.687775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.689084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.689972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.690695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.692129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.692528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.692556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.692575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.692593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.695308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.695664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.695725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.696056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.696416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.696791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.697136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.698659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.699010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.699375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.699403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.699421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.699438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.702767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.703 [2024-05-15 04:30:33.703494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.704381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.704737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.705036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.705885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.706210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.707566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.707632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.707914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.707942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.707960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.707976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.710687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.711516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.711576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.712120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.712518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.713768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.713841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.715262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.703 [2024-05-15 04:30:33.715341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.715660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.715696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.715715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.967 [2024-05-15 04:30:33.715731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.717313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.718031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.718086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.719429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.719846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.720213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.720274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.721328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.721392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.721767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.721794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.721812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.721837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.726186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.727682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.727743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.729225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.729539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.730797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.730884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.731216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.731277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.731657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.967 [2024-05-15 04:30:33.731682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.731699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.731715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.733461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.734946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.734998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.736445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.736875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.738209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.738261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.739609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.739669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.739952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.739976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.739991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.740005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.742683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.743191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.743253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.744431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.744704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.746210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.746271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.746957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.967 [2024-05-15 04:30:33.747012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.747301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.747327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.747344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.747359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.748883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.967 [2024-05-15 04:30:33.749361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.749420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.749475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.749876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.751589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.751659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.752021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.752073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.752403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.752430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.752447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.752463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.755821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.755923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.755971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.756017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.756287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.757347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.968 [2024-05-15 04:30:33.757408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.757467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.757523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.757775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.757798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.757838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.757853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.759632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.759697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.759756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.759835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.760089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.760183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.760243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.760299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.760352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.760623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.760655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.760674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.760702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.765262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.765322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.765386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.765442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.968 [2024-05-15 04:30:33.765710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.765779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.765845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.765911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.765960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.766344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.766382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.766400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.766418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.768143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.768196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.768247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.768291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.768564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.768635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.768690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.768743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.768798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.769061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.769086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.769101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.769132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.772931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:45.968 [2024-05-15 04:30:33.772984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:45.968 [2024-05-15 04:30:33.773040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.234 [2024-05-15 04:30:34.039893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.234 [the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated roughly 500 times between 04:30:33.773 and 04:30:34.040; intermediate repetitions collapsed]
00:29:46.234 [2024-05-15 04:30:34.039906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.044221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.044579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.046048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.046416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.046802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.047151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.047505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.048814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.049145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.049554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.049581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.049598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.049613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.053933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.054288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.055791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.056121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.056526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.056856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.057155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.058421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.058774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.234 [2024-05-15 04:30:34.059166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.234 [2024-05-15 04:30:34.059193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.059210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.059226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.063458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.063810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.065283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.065638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.066039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.066420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.066773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.068025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.068398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.068799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.068831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.068851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.068880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.073229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.074893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.075231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.075584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.075880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.076217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.076574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.078216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.235 [2024-05-15 04:30:34.079617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.079976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.079997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.080011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.080024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.083642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.084007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.084363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.085637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.086030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.086417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.087752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.088087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.088462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.088851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.088890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.088904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.088917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.091911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.092263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.092616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.092679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.092962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.093310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.235 [2024-05-15 04:30:34.093661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.095138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.095210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.095600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.095624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.095647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.095664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.099099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.099161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.099225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.099936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.100321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.100386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.100440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.100493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.100852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.101206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.101231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.101247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.101264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.106354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.106709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.107821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.108359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.235 [2024-05-15 04:30:34.108752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.109721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.110411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.110762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.111099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.111446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.111471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.111487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.111502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.115740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.116086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.116783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.117742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.118122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.118677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.119789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.120118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.120486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.120874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.120913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.120927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.120939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.124419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.235 [2024-05-15 04:30:34.124777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.235 [2024-05-15 04:30:34.125110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.126468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.126863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.127231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.128714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.129056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.129413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.129762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.129786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.129803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.129818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.133698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.134046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.135562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.135929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.136352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.137895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.138239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.138648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.139914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.140328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.140353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.140370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.140386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.236 [2024-05-15 04:30:34.145212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.145580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.145956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.146321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.146723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.148335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.148690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.149037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.150594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.151032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.151053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.151083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.151097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.155207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.155559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.156384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.157192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.157572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.158532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.159573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.161023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.161767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.162033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.162055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.236 [2024-05-15 04:30:34.162073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.162087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.166051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.166600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.166660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.167846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.168121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.169607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.170565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.172237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.173808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.174072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.174093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.174107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.174135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.177706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.178750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.179916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.181370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.181644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.182343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.183792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.185282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.185342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.236 [2024-05-15 04:30:34.185614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.185638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.185654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.185669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.188486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.188846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.188917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.190046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.190337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.191809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.191876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.192847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.192915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.193182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.193206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.193222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.193238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.197181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.198163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.198231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.198617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.199003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.200353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.236 [2024-05-15 04:30:34.200412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.236 [2024-05-15 04:30:34.201890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.201956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.202216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.202240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.202257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.202271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.206442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.207929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.207987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.208336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.208678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.209951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.210016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.210365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.210427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.210710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.210735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.210751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.210766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.214015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.215486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.215546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.216302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.216579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.237 [2024-05-15 04:30:34.217004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.217070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.217431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.217492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.217761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.217785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.217801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.217816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.221113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.222637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.222698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.224337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.224610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.226236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.226296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.227318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.227378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.227774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.227797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.227814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.227843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.230053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.231675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.231735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.237 [2024-05-15 04:30:34.231788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.232085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.233377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.233436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.234855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.234925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.235193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.235218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.235234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.235249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.238449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.238509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.238563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.238615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.239015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.240490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.240552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.240605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.240657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.240947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.240968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.240982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.240995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.245685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.237 [2024-05-15 04:30:34.245756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.245815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.245893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.246159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.246230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.246286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.246341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.246396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.246820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.246881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.246907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.237 [2024-05-15 04:30:34.246933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.251079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.251148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.251212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.251266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.251626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.251694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.251749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.251804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.251868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.252158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.252196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.500 [2024-05-15 04:30:34.252212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.500 [2024-05-15 04:30:34.252228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:46.500 [2024-05-15 04:30:34.256897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeat continuously between 04:30:34.256 and 04:30:34.464 ...]
00:29:46.506 [2024-05-15 04:30:34.464322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:46.506 [2024-05-15 04:30:34.464670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.464691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.464705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.464717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.467295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.467615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.467943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.468295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.468661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.469006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.469317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.469664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.470670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.471005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.471027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.471041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.471055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.474757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.475115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.475445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.475771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.476161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.476485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.476787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.506 [2024-05-15 04:30:34.477117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.477437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.477789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.477831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.477848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.477863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.481304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.481619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.482081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.483088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.483323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.483950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.484863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.485900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.486236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.486567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.486603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.486617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.486630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.489372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.490671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.491332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.492200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.492447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.506 [2024-05-15 04:30:34.493660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.494000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.494320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.494648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.495018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.495040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.495055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.495089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.496771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.497802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.499104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.500417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.500705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.501042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.501388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.501711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.502051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.502318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.502340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.502354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.502367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.504704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.506082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.506 [2024-05-15 04:30:34.507396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.506 [2024-05-15 04:30:34.507745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.507 [2024-05-15 04:30:34.508137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.507 [2024-05-15 04:30:34.508513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.507 [2024-05-15 04:30:34.508854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.507 [2024-05-15 04:30:34.509179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.507 [2024-05-15 04:30:34.510399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.507 [2024-05-15 04:30:34.510681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.507 [2024-05-15 04:30:34.510706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.507 [2024-05-15 04:30:34.510720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.507 [2024-05-15 04:30:34.510734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.513357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.514714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.515059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.515113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.515461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.515794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.516140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.516959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.517009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.517271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.517291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.517305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.517317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.518687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.769 [2024-05-15 04:30:34.518752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.518798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.520101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.520345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.520402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.520449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.520498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.520795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.521167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.521207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.521223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.521237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.523883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.525247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.525679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.526665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.526924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.528195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.529303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.529619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.529945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.530280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.530315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.530328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.769 [2024-05-15 04:30:34.530341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.532976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.533545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.769 [2024-05-15 04:30:34.534551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.535791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.536067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.537026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.537326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.537623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.537956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.538280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.538317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.538331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.538344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.540180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.541221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.542506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.543761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.544057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.544363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.544690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.544993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.545289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.545554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.770 [2024-05-15 04:30:34.545576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.545590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.545602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.547857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.549103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.550355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.550681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.551038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.551343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.551670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.552038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.553071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.553306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.553326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.553340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.553353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.555697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.556974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.557274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.557572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.557888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.558193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.558734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.559772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.770 [2024-05-15 04:30:34.561072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.561311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.561331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.561345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.561357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.564012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.564314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.564611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.564933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.565290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.566020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.567015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.568272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.569555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.569869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.569891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.569905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.569917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.571387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.571702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.571755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.572057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.572386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.573502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.770 [2024-05-15 04:30:34.574500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.575736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.577067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.577395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.577415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.577444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.577456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.579011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.579327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.579625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.579933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.580167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.581245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.582504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.583940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.584007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.770 [2024-05-15 04:30:34.584270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.584291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.584304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.584317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.585656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.585977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.586027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.586323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.771 [2024-05-15 04:30:34.586647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.587144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.587211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.588219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.588284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.588512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.588533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.588546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.588558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.589872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.590191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.590241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.590538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.590875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.591178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.591242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.592421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.592488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.592722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.592743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.592756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.592769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.594475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.594780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.771 [2024-05-15 04:30:34.594850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.595174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.595489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.596787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.596859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.597849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.597900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.598132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.598153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.598166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.598178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.599673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.599994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.600043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.600338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.600604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.601837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.601903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.603303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.603370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.603598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.603619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.603632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.603646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.771 [2024-05-15 04:30:34.604890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.605192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.605241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.605537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.605880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.606192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.606241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.607273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.607339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.607596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.607617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.607630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.607642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.608962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.610212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.610278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.610324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.610550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.611078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.611143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.611439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.611488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.611778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.611799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.771 [2024-05-15 04:30:34.611812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.611831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.771 [2024-05-15 04:30:34.613289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.613353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.613398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.613446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.613675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.614329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.614394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.614440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.614485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.614731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.614755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.614770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.614782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.616238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.616303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.616350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.616395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.616736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.616805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.616876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.616924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.772 [2024-05-15 04:30:34.616970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:46.772 [2024-05-15 04:30:34.617270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:46.772 [2024-05-15 04:30:34.617291 - 04:30:34.760370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! (same error repeated for every src_mbuf allocation attempt in this interval)
00:29:46.778 [2024-05-15 04:30:34.761653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:46.778 [2024-05-15 04:30:34.762803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.763940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.764956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.765219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.765244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.765258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.765271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.767108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.767438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.768586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.769881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.770148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.771411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.772421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.773451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.774748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.775023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.775046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.775060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.775072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.777761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.778792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:46.778 [2024-05-15 04:30:34.780153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.781373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.041 [2024-05-15 04:30:34.781716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.783140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.784434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.785877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.787193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.787505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.787526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.787539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.787552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.790076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.791363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.792654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.793184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.793429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.794629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.796037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.797469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.797805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.798140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.798177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.798192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.798205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.800884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.802157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.041 [2024-05-15 04:30:34.802629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.803710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.803984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.805279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.806355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.806671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.806976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.807303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.807324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.807338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.807351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.809980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.810420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.811592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.812863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.813096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.814299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.814620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.814926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.815223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.815573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.815595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.815610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.815623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.041 [2024-05-15 04:30:34.817237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.818250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.819523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.820779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.821084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.821421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.821749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.822057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.822355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.822594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.822615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.822629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.822642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.825018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.826243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.827523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.828467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.828805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.829117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.829417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.829739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.831055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.831352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.831377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.041 [2024-05-15 04:30:34.831392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.041 [2024-05-15 04:30:34.831404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.834018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.835335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.836348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.836413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.836728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.837067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.837384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.837683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.837732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.837980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.838001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.838014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.838028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.839335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.839399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.839444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.840573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.840818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.840885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.840937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.840986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.842064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.042 [2024-05-15 04:30:34.842407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.842428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.842457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.842469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.845012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.846261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.847548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.848054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.848342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.849629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.850876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.851801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.852124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.852445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.852480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.852494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.852508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.855192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.856484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.856912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.857955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.858221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.859508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.860640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.042 [2024-05-15 04:30:34.860973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.861271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.861615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.861637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.861650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.861663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.864211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.864685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.865708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.866979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.867249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.868335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.868666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.868993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.869339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.869686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.869707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.869721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.869734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.871476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.872675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.873932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.875167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.875424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.042 [2024-05-15 04:30:34.875729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.876084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.876400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.876731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.876992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.877015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.877029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.877042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.878688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.879039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.879356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.879683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.880071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.880421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.881289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.881883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.882218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.882546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.882581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.882604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.882632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.885624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.042 [2024-05-15 04:30:34.887092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.888423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.043 [2024-05-15 04:30:34.889626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.890000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.890327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.890656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.890981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.891924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.892202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.892224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.892238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.892257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.895186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.896521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.896588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.897408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.897777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.898125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.898449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.898756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.900287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.900541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.900561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.900575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.900588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.903161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.043 [2024-05-15 04:30:34.904471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.905127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.905465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.905820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.906184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.906484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.907938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.908006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.908254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.908275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.908290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.908303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.909596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.910900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.910967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.912402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.912728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.913078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.913153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.913465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.913514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.913867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.913891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.913906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.043 [2024-05-15 04:30:34.913920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.915195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.915718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.915783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.916776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.917054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.918343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.918409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.918958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.919030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.919443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.919464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.919494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.919509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.921155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.922435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.922500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.923764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.924139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.925622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.925690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.927042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.927109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.927354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.043 [2024-05-15 04:30:34.927374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.927388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.927400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.929040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.929372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.929424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.930551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.930799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.932203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.932268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.933266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.933332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.933562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.933582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.933596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.933613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.934947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.935283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.935341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.935638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.936004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.936706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.043 [2024-05-15 04:30:34.936772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.937791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.044 [2024-05-15 04:30:34.937865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.938118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.938139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.938154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.938167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.939530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.940835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.940901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.940949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.941330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.941649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.941699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.942023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.942088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.942434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.942456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.942471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.942485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.943733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.943798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.943869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.943925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.944230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.044 [2024-05-15 04:30:34.945264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.044 [2024-05-15 04:30:34.945330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:47.311 [2024-05-15 04:30:35.103049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:47.311 [2024-05-15 04:30:35.104067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.311 [2024-05-15 04:30:35.104333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.311 [2024-05-15 04:30:35.105604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.311 [2024-05-15 04:30:35.106153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.311 [2024-05-15 04:30:35.106470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.311 [2024-05-15 04:30:35.106771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.311 [2024-05-15 04:30:35.107172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.311 [2024-05-15 04:30:35.107195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.311 [2024-05-15 04:30:35.107210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.311 [2024-05-15 04:30:35.107224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.311 [2024-05-15 04:30:35.109740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.110808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.111859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.113149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.113400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.113881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.114221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.114520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.114821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.115211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.115232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.115245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.115257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.117146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.312 [2024-05-15 04:30:35.118198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.119479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.120798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.121129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.121459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.121785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.122118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.122435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.122668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.122689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.122702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.122715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.125041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.126328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.127619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.128094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.128451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.128786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.129123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.129442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.130856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.131126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.131146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.131160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.312 [2024-05-15 04:30:35.131173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.133644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.134937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.135637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.135978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.136340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.136661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.136987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.138302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.139516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.139762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.139783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.139797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.139831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.142423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.143135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.143455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.143754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.144110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.144431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.145709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.146886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.148241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.148490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.312 [2024-05-15 04:30:35.148511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.148525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.148542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.150723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.151069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.151386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.151687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.152059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.153092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.154098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.155365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.156655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.156993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.157015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.157029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.157042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.158666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.159009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.159328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.159657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.159917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.312 [2024-05-15 04:30:35.160929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.162212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.163493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.313 [2024-05-15 04:30:35.164063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.164371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.164392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.164406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.164419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.166050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.166387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.166688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.167858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.168157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.169419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.170734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.171164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.172192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.172440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.172461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.172475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.172487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.174333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.174653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.175917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.175988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.176239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.177509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.313 [2024-05-15 04:30:35.178915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.179735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.179801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.180101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.180123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.180153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.180165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.181847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.181915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.181962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.182379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.182635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.182697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.182745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.182791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.183556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.183872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.183894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.183909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.183922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.185957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.186465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.187445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.187760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.313 [2024-05-15 04:30:35.188109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.188445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.188743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.189072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.190493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.190896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.190917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.190931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.190944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.192514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.192851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.193175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.193503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.193738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.195031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.196495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.197839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.198767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.199066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.199088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.199102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.199115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.200903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.201222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.313 [2024-05-15 04:30:35.201581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.202672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.202948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.204273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.205415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.206550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.207599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.207874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.207895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.207909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.207922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.209728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.210094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.211208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.212443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.212692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.213908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.214994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.216015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.313 [2024-05-15 04:30:35.217309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.217558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.217579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.217592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.217605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.314 [2024-05-15 04:30:35.219892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.220917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.222156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.223433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.223733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.225074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.226306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.227692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.229098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.229437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.229459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.229473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.229486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.232075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.233335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.234573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.235261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.235513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.236695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.238051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.239525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.239865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.240241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.240262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.314 [2024-05-15 04:30:35.240290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.240304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.243044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.244340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.244406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.244808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.245091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.246532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.247813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.248924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.249260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.249595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.249632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.249646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.249660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.252387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.253664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.254144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.255188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.255439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.256740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.257913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.258249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.258302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.314 [2024-05-15 04:30:35.258634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.258670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.258686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.258699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.260298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.261696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.261762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.262735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.263012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.264019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.264086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.265368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.265436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.265665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.265685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.265698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.265713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.267419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.268369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.268434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.269519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.269766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.271135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.271185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.314 [2024-05-15 04:30:35.271742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.271807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.272092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.272114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.272143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.272156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.273642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.273985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.274040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.274355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.274693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.275818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.275889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.277050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.277118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.277404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.314 [2024-05-15 04:30:35.277424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.277438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.277451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.278760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.279102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.279168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.279469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.279849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.315 [2024-05-15 04:30:35.280425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.280490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.281485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.281552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.281781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.281816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.281841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.281855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.283273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.284578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.284645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.284969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.285338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.285673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.285727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.286050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.286117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.286416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.286437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.286450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.286463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.287748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.289228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.315 [2024-05-15 04:30:35.289293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.315 [2024-05-15 04:30:35.289341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated several hundred times between 04:30:35.289 and 04:30:35.469 ...]
00:29:47.583 [2024-05-15 04:30:35.469073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:47.583 [2024-05-15 04:30:35.472037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.583 [2024-05-15 04:30:35.473504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.474513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.474872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.475270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.475629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.475986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.477106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.478264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.478537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.478561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.478577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.478592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.481562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.482606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.482982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.483334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.483688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.484057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.485239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.486391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.487840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.488097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.488133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.584 [2024-05-15 04:30:35.488147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.488159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.490685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.491034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.491402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.491756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.492154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.493429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.494592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.496054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.497653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.498010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.498031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.498045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.498057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.499954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.500308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.500665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.501013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.501288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.502511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.503989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.505628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.506509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.584 [2024-05-15 04:30:35.506877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.506914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.506928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.506941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.508991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.509356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.509728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.510992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.511271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.512839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.514134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.515375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.516542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.516815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.516848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.516865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.516895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.518988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.519621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.520788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.522215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.522487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.523644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.525065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.584 [2024-05-15 04:30:35.526391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.527895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.528161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.528186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.528202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.528217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.530867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.532023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.533480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.534943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.584 [2024-05-15 04:30:35.535300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.536935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.538459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.540043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.541462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.541789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.541813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.541837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.541853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.544854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.546319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.547770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.548310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.548596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.585 [2024-05-15 04:30:35.550190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.551655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.552851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.553212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.553602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.553627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.553644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.553660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.556784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.558256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.558609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.558971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.559397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.559754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.560092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.560452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.560806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.561117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.561158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.561173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.561199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.563608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.564443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.565227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.585 [2024-05-15 04:30:35.565578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.565987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.566354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.566705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.567296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.568305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.568630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.568654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.568670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.568685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.570576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.570939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.571272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.571330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.571629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.573038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.574513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.575987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.576056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.576395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.576419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.576435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.576450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.578035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.585 [2024-05-15 04:30:35.578100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.578172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.578521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.578872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.578929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.578976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.579021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.579377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.579647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.579671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.579687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.579703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.582371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.583844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.585288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.585756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.586139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.586512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.586887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.587387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.588593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.588882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.588907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.585 [2024-05-15 04:30:35.588923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.585 [2024-05-15 04:30:35.588947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.846 [2024-05-15 04:30:35.591957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.846 [2024-05-15 04:30:35.593517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.846 [2024-05-15 04:30:35.593891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.846 [2024-05-15 04:30:35.594232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.846 [2024-05-15 04:30:35.594598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.594969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.595688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.596889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.598360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.598633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.598657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.598673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.598688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.601837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.602195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.602545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.602922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.603298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.604289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.605439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.606899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.608337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.608684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.847 [2024-05-15 04:30:35.608709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.608725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.608740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.610547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.610920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.611289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.611639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.611927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.613060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.614533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.616003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.616682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.616998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.617020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.617039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.617053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.619014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.619384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.619736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.621159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.621444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.623060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.624610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.625649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.847 [2024-05-15 04:30:35.626839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.627098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.627134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.627151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.627166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.629306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.629923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.631035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.632479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.632751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.633971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.635321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.636682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.638213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.638483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.638508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.638524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.638539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.641583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.642751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.642816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.644313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.644586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.645077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.847 [2024-05-15 04:30:35.646256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.647589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.648831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.649216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.649242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.649258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.649273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.652656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.654110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.654731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.655910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.656171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.657494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.658666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.659018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.659070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.659455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.659480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.659497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.659513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.661372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.662831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.662892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.663699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.847 [2024-05-15 04:30:35.663980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.665231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.665291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.666908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.847 [2024-05-15 04:30:35.666974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.667222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.667247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.667263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.667280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.669288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.670838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.670899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.672326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.672594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.674011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.674076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.675501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.675563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.675839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.675864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.675893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.675906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.677733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.678074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.848 [2024-05-15 04:30:35.678139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.679646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.679959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.680688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.680755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.682135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.682204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.682472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.682498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.682521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.682537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.684561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.686205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.686266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.686704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.686985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.688505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.688574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.688933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.688997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.689378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.689404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.689421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:29:47.848 [2024-05-15 04:30:35.689436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:29:47.848 [2024-05-15 04:30:35.691243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same src_mbufs *ERROR* entry from accel_dpdk_cryptodev_task_alloc_resources repeats for every allocation attempt between 04:30:35.691 and 04:30:35.890 ...]
00:29:48.111 [2024-05-15 04:30:35.890167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:29:48.111 [2024-05-15 04:30:35.891580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
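The burst of src_mbufs/dst_mbufs errors above is the crypto accel module's per-task resource allocation bailing out when its DPDK mbuf pool is momentarily empty under the 128-deep verify workload; the Fail/s column in the table below stays at 0.00, consistent with -ENOMEM being treated as a retryable condition rather than an I/O failure. A minimal sketch of that failure path follows; the pool handle, the struct layout, and the error reporting are assumptions for illustration, not the actual accel_dpdk_cryptodev.c code.

/* Illustrative sketch only -- not the actual accel_dpdk_cryptodev.c code.
 * Assumes a DPDK packet-mbuf pool ("mbuf_pool") and a simplified task that
 * needs one source and one destination mbuf per crypto operation. */
#include <errno.h>
#include <stdint.h>
#include <stdio.h>
#include <rte_mbuf.h>
#include <rte_mempool.h>

struct crypto_task_sketch {
	uint32_t cryop_cnt;               /* crypto ops in this task (<= 128 here, matching -q 128) */
	struct rte_mbuf *src_mbufs[128];  /* per-op source mbufs */
	struct rte_mbuf *dst_mbufs[128];  /* per-op destination mbufs */
};

/* Returns 0 on success, or -ENOMEM when the pool is temporarily exhausted,
 * mirroring the "Failed to get src_mbufs!/dst_mbufs!" lines above. */
static int
task_alloc_resources_sketch(struct rte_mempool *mbuf_pool,
			    struct crypto_task_sketch *task)
{
	if (rte_pktmbuf_alloc_bulk(mbuf_pool, task->src_mbufs, task->cryop_cnt)) {
		/* Pool empty: report and let the caller retry the task later. */
		fprintf(stderr, "Failed to get src_mbufs!\n");
		return -ENOMEM;
	}

	if (rte_pktmbuf_alloc_bulk(mbuf_pool, task->dst_mbufs, task->cryop_cnt)) {
		fprintf(stderr, "Failed to get dst_mbufs!\n");
		rte_pktmbuf_free_bulk(task->src_mbufs, task->cryop_cnt);
		return -ENOMEM;
	}

	return 0;
}

Each failed bulk allocation produces one log line, which would explain why the message recurs once per submission attempt until completions return mbufs to the pool.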
00:29:48.369 
00:29:48.369 Latency(us)
00:29:48.369 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:48.369 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:48.369 Verification LBA range: start 0x0 length 0x100
00:29:48.369 crypto_ram : 5.75 44.54 2.78 0.00 0.00 2784973.56 337097.77 2261817.27
00:29:48.369 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:48.369 Verification LBA range: start 0x100 length 0x100
00:29:48.369 crypto_ram : 5.74 47.23 2.95 0.00 0.00 2639586.70 2706.39 2087831.32
00:29:48.369 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:48.369 Verification LBA range: start 0x0 length 0x100
00:29:48.369 crypto_ram1 : 5.75 44.54 2.78 0.00 0.00 2691050.76 337097.77 2075403.76
00:29:48.369 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:48.369 Verification LBA range: start 0x100 length 0x100
00:29:48.369 crypto_ram1 : 5.74 47.39 2.96 0.00 0.00 2546700.85 2463.67 1901417.81
00:29:48.369 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:48.369 Verification LBA range: start 0x0 length 0x100
00:29:48.369 crypto_ram2 : 5.54 308.33 19.27 0.00 0.00 374947.27 25826.04 568561.21
00:29:48.369 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:48.369 Verification LBA range: start 0x100 length 0x100
00:29:48.369 crypto_ram2 : 5.55 331.80 20.74 0.00 0.00 349064.55 41554.68 546812.97
00:29:48.369 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:29:48.369 Verification LBA range: start 0x0 length 0x100
00:29:48.369 crypto_ram3 : 5.62 319.24 19.95 0.00 0.00 353041.59 13689.74 357292.56
00:29:48.369 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:29:48.369 Verification LBA range: start 0x100 length 0x100
00:29:48.369 crypto_ram3 : 5.63 340.96 21.31 0.00 0.00 331267.39 29321.29 466033.78
00:29:48.369 ===================================================================================================================
00:29:48.369 Total : 1484.04 92.75 0.00 0.00 644634.37 2463.67 2261817.27
00:29:48.934 
00:29:48.934 real 0m9.135s
00:29:48.934 user 0m17.105s
00:29:48.934 sys 0m0.673s
00:29:48.934 04:30:36 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1122 -- # xtrace_disable
00:29:48.934 04:30:36 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:29:48.934 ************************************
00:29:48.934 END TEST bdev_verify_big_io
00:29:48.934 ************************************
00:29:48.934 04:30:36 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:48.934 04:30:36 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:29:48.934 04:30:36 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable
00:29:48.934 04:30:36 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:29:48.934 ************************************
00:29:48.934 START TEST bdev_write_zeroes
00:29:48.934 ************************************
00:29:48.934 04:30:36 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:48.934 [2024-05-15 04:30:36.820333] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization...
00:29:48.934 [2024-05-15 04:30:36.820403] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3998632 ]
00:29:48.934 [2024-05-15 04:30:36.902513] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:49.192 [2024-05-15 04:30:37.022889] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:29:49.192 [2024-05-15 04:30:37.044063] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:29:49.192 [2024-05-15 04:30:37.052127] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:29:49.192 [2024-05-15 04:30:37.060132] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:29:49.192 [2024-05-15 04:30:37.170646] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:29:51.719 [2024-05-15 04:30:39.521910] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:29:51.719 [2024-05-15 04:30:39.521996] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:29:51.719 [2024-05-15 04:30:39.522016] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:51.719 [2024-05-15 04:30:39.529928] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:29:51.719 [2024-05-15 04:30:39.529957] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:29:51.719 [2024-05-15 04:30:39.529972] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:51.719 [2024-05-15 04:30:39.537948] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:29:51.719 [2024-05-15 04:30:39.537976] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:29:51.719 [2024-05-15 04:30:39.537989] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:51.719 [2024-05-15 04:30:39.545970] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:29:51.719 [2024-05-15 04:30:39.545997] bdev.c:8109:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:29:51.719 [2024-05-15 04:30:39.546011] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:29:51.719 Running I/O for 1 seconds...
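The write_zeroes pass reuses the same four crypto bdevs and drives each with 4 KiB zero-fill requests at queue depth 128 for one second (the -o 4096 -q 128 -t 1 arguments above). For reference, here is a hedged sketch of how a single write_zeroes request is submitted through the public SPDK bdev API; desc and ch are assumed to have been obtained elsewhere in a running SPDK application via spdk_bdev_open_ext() and spdk_bdev_get_io_channel(), and this is an illustration rather than bdevperf's actual submission loop.

/* Sketch: issue one write_zeroes request through the SPDK bdev API.
 * Assumes 'desc' and 'ch' were set up elsewhere in a running SPDK app. */
#include "spdk/bdev.h"
#include "spdk/log.h"

static void
write_zeroes_done(struct spdk_bdev_io *bdev_io, bool success, void *cb_arg)
{
	if (!success) {
		SPDK_ERRLOG("write_zeroes I/O failed\n");
	}
	/* Every completed bdev I/O must be returned to the bdev layer. */
	spdk_bdev_free_io(bdev_io);
}

/* Returns 0 if the request was queued, or a negative errno otherwise. */
static int
submit_one_write_zeroes(struct spdk_bdev_desc *desc, struct spdk_io_channel *ch,
			uint64_t offset_blocks, uint64_t num_blocks)
{
	return spdk_bdev_write_zeroes_blocks(desc, ch, offset_blocks, num_blocks,
					     write_zeroes_done, NULL);
}

At this layer the request looks the same whether the target is a plain Malloc bdev or a crypto vbdev stacked on top of one; the crypto handling happens transparently underneath, which is what this test exercises.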
00:29:53.092 
00:29:53.092 Latency(us)
00:29:53.092 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:29:53.092 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:53.092 crypto_ram : 1.03 1927.29 7.53 0.00 0.00 65952.81 5728.33 78060.66
00:29:53.092 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:53.092 crypto_ram1 : 1.03 1933.09 7.55 0.00 0.00 65389.46 5339.97 73011.96
00:29:53.092 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:53.092 crypto_ram2 : 1.02 14806.67 57.84 0.00 0.00 8518.22 2366.58 10582.85
00:29:53.092 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:29:53.092 crypto_ram3 : 1.02 14841.28 57.97 0.00 0.00 8474.07 2293.76 9175.04
00:29:53.092 ===================================================================================================================
00:29:53.092 Total : 33508.33 130.89 0.00 0.00 15110.11 2293.76 78060.66
00:29:53.092 
00:29:53.092 real 0m4.328s
00:29:53.092 user 0m3.806s
00:29:53.092 sys 0m0.481s
00:29:53.092 04:30:41 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1122 -- # xtrace_disable
00:29:53.092 04:30:41 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:29:53.092 ************************************
00:29:53.092 END TEST bdev_write_zeroes
00:29:53.092 ************************************
00:29:53.350 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:53.350 04:30:41 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']'
00:29:53.350 04:30:41 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable
00:29:53.350 04:30:41 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:29:53.350 ************************************
00:29:53.350 START TEST bdev_json_nonenclosed
00:29:53.350 ************************************
00:29:53.350 04:30:41 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:29:53.351 [2024-05-15 04:30:41.192714] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization...
00:29:53.351 [2024-05-15 04:30:41.192784] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3999177 ]
00:29:53.351 [2024-05-15 04:30:41.272317] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:29:53.608 [2024-05-15 04:30:41.388882] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:29:53.608 [2024-05-15 04:30:41.388980] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:29:53.609 [2024-05-15 04:30:41.389007] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:29:53.609 [2024-05-15 04:30:41.389033] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:29:53.609 00:29:53.609 real 0m0.372s 00:29:53.609 user 0m0.253s 00:29:53.609 sys 0m0.117s 00:29:53.609 04:30:41 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:53.609 04:30:41 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:29:53.609 ************************************ 00:29:53.609 END TEST bdev_json_nonenclosed 00:29:53.609 ************************************ 00:29:53.609 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:53.609 04:30:41 blockdev_crypto_qat -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:29:53.609 04:30:41 blockdev_crypto_qat -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:53.609 04:30:41 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:53.609 ************************************ 00:29:53.609 START TEST bdev_json_nonarray 00:29:53.609 ************************************ 00:29:53.609 04:30:41 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:29:53.609 [2024-05-15 04:30:41.614145] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:29:53.609 [2024-05-15 04:30:41.614216] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3999198 ] 00:29:53.866 [2024-05-15 04:30:41.694173] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:53.866 [2024-05-15 04:30:41.813154] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:53.866 [2024-05-15 04:30:41.813275] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
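Both bdev_json_nonenclosed and bdev_json_nonarray are negative tests: each hands bdevperf a deliberately malformed --json configuration and only passes if the app rejects it and stops with a non-zero status (the "spdk_app_stop'd on non-zero" warning is the expected path). Judging by the error messages, the malformed inputs look roughly like the following compared with the valid shape used elsewhere in this log; the file contents here are an inference from the test names, not copied from the repository:

  # valid shape expected by bdevperf --json:
  { "subsystems": [ { "subsystem": "bdev", "config": [ ... ] } ] }
  # nonenclosed.json : the config fragment is not wrapped in the outer {} object
  # nonarray.json    : "subsystems" is present but is not a JSON array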
00:29:53.866 [2024-05-15 04:30:41.813303] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:29:53.866 [2024-05-15 04:30:41.813318] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:29:54.124 00:29:54.124 real 0m0.383s 00:29:54.124 user 0m0.270s 00:29:54.124 sys 0m0.110s 00:29:54.124 04:30:41 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:54.124 04:30:41 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:29:54.124 ************************************ 00:29:54.124 END TEST bdev_json_nonarray 00:29:54.124 ************************************ 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:29:54.124 04:30:41 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:29:54.124 00:29:54.124 real 1m12.968s 00:29:54.124 user 2m35.667s 00:29:54.124 sys 0m9.065s 00:29:54.124 04:30:41 blockdev_crypto_qat -- common/autotest_common.sh@1122 -- # xtrace_disable 00:29:54.124 04:30:41 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:29:54.124 ************************************ 00:29:54.124 END TEST blockdev_crypto_qat 00:29:54.124 ************************************ 00:29:54.124 04:30:41 -- spdk/autotest.sh@356 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:29:54.124 04:30:41 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:29:54.124 04:30:41 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:29:54.124 04:30:41 -- common/autotest_common.sh@10 -- # set +x 00:29:54.125 ************************************ 00:29:54.125 START TEST chaining 00:29:54.125 ************************************ 00:29:54.125 04:30:42 chaining -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:29:54.125 * Looking for test storage... 
00:29:54.125 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:29:54.125 04:30:42 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@7 -- # uname -s 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8b464f06-2980-e311-ba20-001e67a94acd 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=8b464f06-2980-e311-ba20-001e67a94acd 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:54.125 04:30:42 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:54.125 04:30:42 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:54.125 04:30:42 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:54.125 04:30:42 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:54.125 04:30:42 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:54.125 04:30:42 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:54.125 04:30:42 chaining -- paths/export.sh@5 -- # 
export PATH 00:29:54.125 04:30:42 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@47 -- # : 0 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:29:54.125 04:30:42 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:29:54.125 04:30:42 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:29:54.125 04:30:42 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:29:54.125 04:30:42 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:29:54.125 04:30:42 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:29:54.125 04:30:42 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:29:54.125 04:30:42 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:29:54.125 04:30:42 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:29:54.125 04:30:42 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:29:54.125 04:30:42 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@296 -- # e810=() 00:29:56.653 
04:30:44 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@297 -- # x722=() 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@298 -- # mlx=() 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:82:00.0 (0x8086 - 0x159b)' 00:29:56.653 Found 0000:82:00.0 (0x8086 - 0x159b) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:82:00.1 (0x8086 - 0x159b)' 00:29:56.653 Found 0000:82:00.1 (0x8086 - 0x159b) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:29:56.653 
04:30:44 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:82:00.0: cvl_0_0' 00:29:56.653 Found net devices under 0000:82:00.0: cvl_0_0 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:82:00.1: cvl_0_1' 00:29:56.653 Found net devices under 0000:82:00.1: cvl_0_1 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:29:56.653 04:30:44 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip 
link set lo up 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:29:56.911 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:29:56.911 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.162 ms 00:29:56.911 00:29:56.911 --- 10.0.0.2 ping statistics --- 00:29:56.911 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:29:56.911 rtt min/avg/max/mdev = 0.162/0.162/0.162/0.000 ms 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:29:56.911 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:29:56.911 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.092 ms 00:29:56.911 00:29:56.911 --- 10.0.0.1 ping statistics --- 00:29:56.911 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:29:56.911 rtt min/avg/max/mdev = 0.092/0.092/0.092/0.000 ms 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@422 -- # return 0 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:29:56.911 04:30:44 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:29:56.911 04:30:44 chaining -- common/autotest_common.sh@720 -- # xtrace_disable 00:29:56.911 04:30:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@481 -- # nvmfpid=4001356 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:29:56.911 04:30:44 chaining -- nvmf/common.sh@482 -- # waitforlisten 4001356 00:29:56.911 04:30:44 chaining -- common/autotest_common.sh@827 -- # '[' -z 4001356 ']' 00:29:56.911 04:30:44 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:56.911 04:30:44 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:29:56.912 04:30:44 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:56.912 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:56.912 04:30:44 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:29:56.912 04:30:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:56.912 [2024-05-15 04:30:44.777384] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
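nvmftestinit in phy-fallback mode builds a loopback NVMe/TCP topology out of the two ice ports: cvl_0_0 is moved into a network namespace and addressed as the target side, cvl_0_1 stays in the root namespace as the initiator, and both directions are verified with ping before nvmf_tgt is launched inside the namespace. Condensed from the commands traced above (interface and namespace names are the ones the script derived on this node):

  ip netns add cvl_0_0_ns_spdk
  ip link set cvl_0_0 netns cvl_0_0_ns_spdk
  ip addr add 10.0.0.1/24 dev cvl_0_1
  ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0
  ip link set cvl_0_1 up
  ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up
  ip netns exec cvl_0_0_ns_spdk ip link set lo up
  iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT
  ping -c 1 10.0.0.2 && ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1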
00:29:56.912 [2024-05-15 04:30:44.777465] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:56.912 [2024-05-15 04:30:44.869112] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:57.169 [2024-05-15 04:30:44.987153] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:29:57.169 [2024-05-15 04:30:44.987216] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:29:57.169 [2024-05-15 04:30:44.987244] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:29:57.169 [2024-05-15 04:30:44.987255] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:29:57.169 [2024-05-15 04:30:44.987265] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:29:57.169 [2024-05-15 04:30:44.987308] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:29:57.734 04:30:45 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:29:57.734 04:30:45 chaining -- common/autotest_common.sh@860 -- # return 0 00:29:57.734 04:30:45 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:29:57.734 04:30:45 chaining -- common/autotest_common.sh@726 -- # xtrace_disable 00:29:57.734 04:30:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:57.734 04:30:45 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:29:57.734 04:30:45 chaining -- bdev/chaining.sh@69 -- # mktemp 00:29:57.734 04:30:45 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.fgdYC0JyCH 00:29:57.734 04:30:45 chaining -- bdev/chaining.sh@69 -- # mktemp 00:29:57.734 04:30:45 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.piGO0SA8Kc 00:29:57.734 04:30:45 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:29:57.734 04:30:45 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:29:57.734 04:30:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:57.734 04:30:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:57.734 malloc0 00:29:57.734 true 00:29:57.992 true 00:29:57.992 [2024-05-15 04:30:45.755260] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:29:57.992 crypto0 00:29:57.992 [2024-05-15 04:30:45.763276] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:29:57.992 crypto1 00:29:57.992 [2024-05-15 04:30:45.771401] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:29:57.992 [2024-05-15 04:30:45.787386] nvmf_rpc.c: 615:decode_rpc_listen_address: *WARNING*: decode_rpc_listen_address: deprecated feature [listen_]address.transport is deprecated in favor of trtype to be removed in v24.09 00:29:57.992 [2024-05-15 04:30:45.787675] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@85 -- # update_stats 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:29:57.992 04:30:45 chaining -- 
bdev/chaining.sh@39 -- # opcode= 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:57.992 04:30:45 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 
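The get_stat/update_stats helpers used throughout the rest of the chaining test take a snapshot of the accel framework counters over RPC and pull individual fields out with jq, exactly as the xtrace above shows. A minimal stand-alone equivalent, assuming the default /var/tmp/spdk.sock RPC socket:

  scripts/rpc.py accel_get_stats | jq -r .sequence_executed
  scripts/rpc.py accel_get_stats | jq -r '.operations[] | select(.opcode == "encrypt").executed'
  scripts/rpc.py accel_get_stats | jq -r '.operations[] | select(.opcode == "decrypt").executed'
  scripts/rpc.py accel_get_stats | jq -r '.operations[] | select(.opcode == "copy").executed'

Each spdk_dd pass that follows is bracketed by the same snapshot so the test can assert exactly how many sequences, encrypt/decrypt operations and copies that pass added.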
00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.fgdYC0JyCH bs=1K count=64 00:29:57.992 64+0 records in 00:29:57.992 64+0 records out 00:29:57.992 65536 bytes (66 kB, 64 KiB) copied, 0.00043788 s, 150 MB/s 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.fgdYC0JyCH --ob Nvme0n1 --bs 65536 --count 1 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@25 -- # local config 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:29:57.992 04:30:45 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:29:57.992 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:29:57.992 04:30:46 chaining -- bdev/chaining.sh@31 -- # config='{ 00:29:57.992 "subsystems": [ 00:29:57.992 { 00:29:57.992 "subsystem": "bdev", 00:29:57.992 "config": [ 00:29:57.992 { 00:29:57.993 "method": "bdev_nvme_attach_controller", 00:29:57.993 "params": { 00:29:57.993 "trtype": "tcp", 00:29:57.993 "adrfam": "IPv4", 00:29:57.993 "name": "Nvme0", 00:29:57.993 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:29:57.993 "traddr": "10.0.0.2", 00:29:57.993 "trsvcid": "4420" 00:29:57.993 } 00:29:57.993 }, 00:29:57.993 { 00:29:57.993 "method": "bdev_set_options", 00:29:57.993 "params": { 00:29:57.993 "bdev_auto_examine": false 00:29:57.993 } 00:29:57.993 } 00:29:57.993 ] 00:29:57.993 } 00:29:57.993 ] 00:29:57.993 }' 00:29:57.993 04:30:46 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.fgdYC0JyCH --ob Nvme0n1 --bs 65536 --count 1 00:29:57.993 04:30:46 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:29:57.993 "subsystems": [ 00:29:57.993 { 00:29:57.993 "subsystem": "bdev", 00:29:57.993 "config": [ 00:29:57.993 { 00:29:57.993 "method": "bdev_nvme_attach_controller", 00:29:57.993 "params": { 00:29:57.993 "trtype": "tcp", 00:29:57.993 "adrfam": "IPv4", 00:29:57.993 "name": "Nvme0", 00:29:57.993 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:29:57.993 "traddr": "10.0.0.2", 00:29:57.993 "trsvcid": "4420" 00:29:57.993 } 00:29:57.993 }, 00:29:57.993 { 00:29:57.993 "method": "bdev_set_options", 00:29:57.993 "params": { 00:29:57.993 "bdev_auto_examine": false 00:29:57.993 } 00:29:57.993 } 00:29:57.993 ] 00:29:57.993 } 00:29:57.993 ] 00:29:57.993 }' 00:29:58.251 [2024-05-15 04:30:46.048759] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
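Each spdk_dd pass builds its configuration on the fly: gen_nvme.sh emits a subsystems JSON that attaches the NVMe-oF controller just exported by the target, jq appends a bdev_set_options entry that disables auto-examine, and the result is handed to spdk_dd on a file descriptor. A condensed sketch of the pipeline traced above (paths shortened; the test feeds the config over /dev/fd/62, so the process substitution used here should be equivalent but is this sketch's own shorthand):

  config=$(scripts/gen_nvme.sh --mode=remote --json-with-subsystems \
               --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 \
           | jq '.subsystems[0].config[.subsystems[0].config | length] |=
                 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}')
  build/bin/spdk_dd -c <(echo "$config") --if /tmp/tmp.fgdYC0JyCH --ob Nvme0n1 --bs 65536 --count 1

The input file is the 64 KiB of /dev/urandom data created with dd just above, written through the crypto chain as a single 64 KiB I/O.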
00:29:58.251 [2024-05-15 04:30:46.048821] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4001544 ] 00:29:58.251 [2024-05-15 04:30:46.128739] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:58.251 [2024-05-15 04:30:46.248705] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:29:58.767  Copying: 64/64 [kB] (average 20 MBps) 00:29:58.767 00:29:58.767 04:30:46 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:29:58.767 04:30:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:58.767 04:30:46 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:29:58.767 04:30:46 chaining -- bdev/chaining.sh@39 -- # opcode= 00:29:58.767 04:30:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:58.767 04:30:46 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:29:58.767 04:30:46 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:29:58.767 04:30:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:58.767 04:30:46 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:29:58.767 04:30:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:59.025 04:30:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:29:59.025 04:30:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:59.025 04:30:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:29:59.025 04:30:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:29:59.025 04:30:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:29:59.025 04:30:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:59.025 04:30:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:59.025 04:30:46 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:29:59.025 04:30:46 chaining 
-- bdev/chaining.sh@95 -- # get_stat executed copy 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:29:59.026 04:30:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:29:59.026 04:30:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:59.026 04:30:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@96 -- # update_stats 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@39 -- # opcode= 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:29:59.026 04:30:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:59.026 04:30:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:59.026 04:30:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:29:59.026 04:30:46 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:59.026 04:30:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:59.026 04:30:46 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:29:59.026 04:30:46 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:29:59.026 04:30:47 chaining 
-- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:29:59.026 04:30:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:59.026 04:30:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:59.026 04:30:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:29:59.026 04:30:47 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:29:59.283 04:30:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:29:59.283 04:30:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:29:59.283 04:30:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.piGO0SA8Kc --ib Nvme0n1 --bs 65536 --count 1 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@25 -- # local config 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:29:59.283 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@31 -- # config='{ 00:29:59.283 "subsystems": [ 00:29:59.283 { 00:29:59.283 "subsystem": "bdev", 00:29:59.283 "config": [ 00:29:59.283 { 00:29:59.283 "method": "bdev_nvme_attach_controller", 00:29:59.283 "params": { 00:29:59.283 "trtype": "tcp", 00:29:59.283 "adrfam": "IPv4", 00:29:59.283 "name": "Nvme0", 00:29:59.283 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:29:59.283 "traddr": "10.0.0.2", 00:29:59.283 "trsvcid": "4420" 00:29:59.283 } 00:29:59.283 }, 00:29:59.283 { 00:29:59.283 "method": "bdev_set_options", 00:29:59.283 "params": { 00:29:59.283 "bdev_auto_examine": false 00:29:59.283 } 00:29:59.283 } 00:29:59.283 ] 00:29:59.283 } 00:29:59.283 ] 00:29:59.283 }' 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.piGO0SA8Kc --ib Nvme0n1 --bs 65536 --count 1 00:29:59.283 04:30:47 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:29:59.283 "subsystems": [ 00:29:59.283 { 00:29:59.283 "subsystem": "bdev", 00:29:59.283 "config": [ 00:29:59.283 { 00:29:59.283 "method": "bdev_nvme_attach_controller", 00:29:59.283 "params": { 00:29:59.283 "trtype": "tcp", 00:29:59.283 "adrfam": "IPv4", 00:29:59.283 "name": "Nvme0", 00:29:59.283 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:29:59.283 "traddr": "10.0.0.2", 00:29:59.283 "trsvcid": "4420" 00:29:59.283 } 00:29:59.283 }, 00:29:59.283 { 00:29:59.283 "method": "bdev_set_options", 00:29:59.283 "params": { 00:29:59.283 
"bdev_auto_examine": false 00:29:59.283 } 00:29:59.283 } 00:29:59.283 ] 00:29:59.283 } 00:29:59.283 ] 00:29:59.283 }' 00:29:59.283 [2024-05-15 04:30:47.159733] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:29:59.283 [2024-05-15 04:30:47.159811] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4001714 ] 00:29:59.283 [2024-05-15 04:30:47.240109] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:59.541 [2024-05-15 04:30:47.356331] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:00.058  Copying: 64/64 [kB] (average 31 MBps) 00:30:00.058 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:00.058 04:30:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:00.058 04:30:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:00.058 04:30:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:00.058 04:30:47 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:00.058 04:30:47 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:00.058 04:30:47 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:00.058 04:30:47 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:00.058 04:30:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:00.058 04:30:48 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:30:00.058 04:30:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:00.058 04:30:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:00.058 04:30:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:00.058 04:30:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:00.058 04:30:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:00.344 04:30:48 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:30:00.344 04:30:48 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.fgdYC0JyCH /tmp/tmp.piGO0SA8Kc 00:30:00.344 04:30:48 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:30:00.344 04:30:48 chaining -- bdev/chaining.sh@25 -- # local config 00:30:00.344 04:30:48 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:00.344 04:30:48 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:00.344 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:00.344 04:30:48 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:00.344 "subsystems": [ 00:30:00.344 { 00:30:00.344 "subsystem": "bdev", 00:30:00.344 "config": [ 00:30:00.344 { 00:30:00.344 "method": "bdev_nvme_attach_controller", 00:30:00.344 "params": { 00:30:00.344 "trtype": "tcp", 00:30:00.344 "adrfam": "IPv4", 00:30:00.344 "name": "Nvme0", 00:30:00.344 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:00.344 "traddr": "10.0.0.2", 00:30:00.344 "trsvcid": "4420" 00:30:00.344 } 00:30:00.344 }, 00:30:00.344 { 00:30:00.344 "method": "bdev_set_options", 00:30:00.344 "params": { 00:30:00.344 "bdev_auto_examine": false 00:30:00.344 } 00:30:00.344 } 00:30:00.344 ] 00:30:00.344 } 00:30:00.344 ] 00:30:00.344 }' 00:30:00.344 04:30:48 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:30:00.344 04:30:48 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:00.344 "subsystems": [ 00:30:00.344 { 00:30:00.344 "subsystem": "bdev", 00:30:00.344 "config": [ 00:30:00.344 { 00:30:00.344 "method": "bdev_nvme_attach_controller", 00:30:00.344 "params": { 00:30:00.344 "trtype": "tcp", 00:30:00.344 "adrfam": "IPv4", 00:30:00.344 "name": "Nvme0", 00:30:00.344 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:00.344 "traddr": "10.0.0.2", 00:30:00.344 "trsvcid": "4420" 00:30:00.344 } 00:30:00.344 }, 00:30:00.344 { 00:30:00.344 "method": "bdev_set_options", 00:30:00.344 "params": { 00:30:00.344 "bdev_auto_examine": false 00:30:00.344 } 00:30:00.344 } 00:30:00.344 ] 00:30:00.344 } 00:30:00.344 ] 00:30:00.344 }' 
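At this point the chained data path has been verified end to end: the 64 KiB of random data written through Nvme0n1 in the first spdk_dd pass was read back into the second temp file, and cmp reported no difference, so encrypt-on-write followed by decrypt-on-read reproduced the original bytes. The pass starting here then overwrites the same region from /dev/zero before the next round of stat snapshots. A condensed view of the steps (temp file names as created by mktemp earlier in the log):

  build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.fgdYC0JyCH --ob Nvme0n1 --bs 65536 --count 1   # write random data
  build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.piGO0SA8Kc --ib Nvme0n1 --bs 65536 --count 1   # read it back
  cmp /tmp/tmp.fgdYC0JyCH /tmp/tmp.piGO0SA8Kc                                                  # verify round trip
  build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1             # overwrite with zeros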
00:30:00.344 [2024-05-15 04:30:48.182797] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:30:00.344 [2024-05-15 04:30:48.182870] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4001868 ] 00:30:00.344 [2024-05-15 04:30:48.262917] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:00.603 [2024-05-15 04:30:48.383656] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:01.119  Copying: 64/64 [kB] (average 15 MBps) 00:30:01.119 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@106 -- # update_stats 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:01.119 04:30:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:01.119 04:30:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:01.119 04:30:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:01.119 04:30:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:01.119 04:30:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:01.119 04:30:48 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:01.119 04:30:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:01.119 04:30:48 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:01.119 04:30:48 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:01.119 04:30:49 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:01.119 04:30:49 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:01.119 04:30:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:01.119 04:30:49 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.fgdYC0JyCH --ob Nvme0n1 --bs 4096 --count 16 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@25 -- # local config 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:01.119 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:01.119 "subsystems": [ 00:30:01.119 { 00:30:01.119 "subsystem": "bdev", 00:30:01.119 "config": [ 00:30:01.119 { 00:30:01.119 "method": "bdev_nvme_attach_controller", 00:30:01.119 "params": { 00:30:01.119 "trtype": "tcp", 00:30:01.119 "adrfam": "IPv4", 00:30:01.119 "name": "Nvme0", 00:30:01.119 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:01.119 "traddr": "10.0.0.2", 00:30:01.119 "trsvcid": "4420" 00:30:01.119 } 00:30:01.119 }, 00:30:01.119 { 00:30:01.119 "method": "bdev_set_options", 00:30:01.119 "params": { 00:30:01.119 "bdev_auto_examine": false 00:30:01.119 } 00:30:01.119 } 00:30:01.119 ] 00:30:01.119 } 00:30:01.119 ] 00:30:01.119 }' 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.fgdYC0JyCH --ob Nvme0n1 --bs 4096 --count 16 00:30:01.119 04:30:49 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:01.119 "subsystems": [ 00:30:01.119 { 00:30:01.119 "subsystem": "bdev", 00:30:01.119 "config": [ 00:30:01.119 { 00:30:01.119 "method": "bdev_nvme_attach_controller", 00:30:01.119 "params": { 00:30:01.119 "trtype": "tcp", 00:30:01.119 "adrfam": "IPv4", 00:30:01.119 "name": "Nvme0", 00:30:01.119 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:01.119 "traddr": "10.0.0.2", 00:30:01.119 "trsvcid": "4420" 00:30:01.119 } 00:30:01.119 }, 00:30:01.119 { 00:30:01.120 "method": "bdev_set_options", 00:30:01.120 "params": { 00:30:01.120 "bdev_auto_examine": false 00:30:01.120 } 00:30:01.120 } 00:30:01.120 ] 00:30:01.120 } 00:30:01.120 ] 00:30:01.120 }' 00:30:01.378 [2024-05-15 04:30:49.167557] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
00:30:01.378 [2024-05-15 04:30:49.167637] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4002026 ] 00:30:01.378 [2024-05-15 04:30:49.243276] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:01.378 [2024-05-15 04:30:49.364377] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:02.202  Copying: 64/64 [kB] (average 8000 kBps) 00:30:02.202 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:30:02.202 
04:30:50 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@114 -- # update_stats 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:02.202 04:30:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:02.202 04:30:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 
00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:02.460 04:30:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:02.460 04:30:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:02.460 04:30:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:02.460 04:30:50 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:02.460 04:30:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:02.460 04:30:50 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@117 -- # : 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.piGO0SA8Kc --ib Nvme0n1 --bs 4096 --count 16 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@25 -- # local config 00:30:02.460 04:30:50 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:30:02.461 04:30:50 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:30:02.461 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:30:02.461 04:30:50 chaining -- bdev/chaining.sh@31 -- # config='{ 00:30:02.461 "subsystems": [ 00:30:02.461 { 00:30:02.461 "subsystem": "bdev", 00:30:02.461 "config": [ 00:30:02.461 { 00:30:02.461 "method": "bdev_nvme_attach_controller", 00:30:02.461 "params": { 00:30:02.461 "trtype": "tcp", 00:30:02.461 "adrfam": "IPv4", 00:30:02.461 "name": "Nvme0", 00:30:02.461 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:02.461 "traddr": "10.0.0.2", 00:30:02.461 "trsvcid": "4420" 00:30:02.461 } 00:30:02.461 }, 00:30:02.461 { 00:30:02.461 "method": "bdev_set_options", 00:30:02.461 "params": { 00:30:02.461 "bdev_auto_examine": false 00:30:02.461 } 00:30:02.461 } 00:30:02.461 ] 00:30:02.461 } 00:30:02.461 ] 00:30:02.461 }' 00:30:02.461 04:30:50 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.piGO0SA8Kc --ib Nvme0n1 --bs 4096 --count 16 00:30:02.461 04:30:50 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:30:02.461 "subsystems": [ 00:30:02.461 { 00:30:02.461 "subsystem": "bdev", 00:30:02.461 "config": [ 00:30:02.461 { 00:30:02.461 "method": "bdev_nvme_attach_controller", 00:30:02.461 "params": { 00:30:02.461 "trtype": "tcp", 00:30:02.461 "adrfam": "IPv4", 00:30:02.461 "name": "Nvme0", 00:30:02.461 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:30:02.461 "traddr": "10.0.0.2", 00:30:02.461 "trsvcid": "4420" 00:30:02.461 } 00:30:02.461 }, 00:30:02.461 { 
00:30:02.461 "method": "bdev_set_options", 00:30:02.461 "params": { 00:30:02.461 "bdev_auto_examine": false 00:30:02.461 } 00:30:02.461 } 00:30:02.461 ] 00:30:02.461 } 00:30:02.461 ] 00:30:02.461 }' 00:30:02.461 [2024-05-15 04:30:50.402897] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:30:02.461 [2024-05-15 04:30:50.402978] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4002193 ] 00:30:02.720 [2024-05-15 04:30:50.484101] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:02.720 [2024-05-15 04:30:50.604398] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:03.236  Copying: 64/64 [kB] (average 688 kBps) 00:30:03.236 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:30:03.236 04:30:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:03.236 04:30:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:03.236 04:30:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:03.236 04:30:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:03.236 04:30:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:03.236 04:30:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:03.236 04:30:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:03.236 04:30:51 chaining -- common/autotest_common.sh@10 -- # set +x 
00:30:03.236 04:30:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:03.236 04:30:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.fgdYC0JyCH /tmp/tmp.piGO0SA8Kc 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.fgdYC0JyCH /tmp/tmp.piGO0SA8Kc 00:30:03.494 04:30:51 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@117 -- # sync 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@120 -- # set +e 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:30:03.494 rmmod nvme_tcp 00:30:03.494 rmmod nvme_fabrics 00:30:03.494 rmmod nvme_keyring 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@124 -- # set -e 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@125 -- # return 0 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@489 -- # '[' -n 4001356 ']' 00:30:03.494 04:30:51 chaining -- nvmf/common.sh@490 -- # killprocess 4001356 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@946 -- # '[' -z 4001356 ']' 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@950 -- # kill -0 4001356 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@951 -- # uname 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 4001356 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 4001356' 00:30:03.494 killing process with pid 4001356 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@965 -- # kill 4001356 00:30:03.494 [2024-05-15 04:30:51.389328] 
app.c:1024:log_deprecation_hits: *WARNING*: decode_rpc_listen_address: deprecation '[listen_]address.transport is deprecated in favor of trtype' scheduled for removal in v24.09 hit 1 times 00:30:03.494 04:30:51 chaining -- common/autotest_common.sh@970 -- # wait 4001356 00:30:03.752 04:30:51 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:30:03.752 04:30:51 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:30:03.752 04:30:51 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:30:03.752 04:30:51 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:30:03.752 04:30:51 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:30:03.752 04:30:51 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:30:03.752 04:30:51 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:30:03.752 04:30:51 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:30:06.284 04:30:53 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:30:06.284 04:30:53 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:30:06.284 04:30:53 chaining -- bdev/chaining.sh@132 -- # bperfpid=4002642 00:30:06.284 04:30:53 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:30:06.284 04:30:53 chaining -- bdev/chaining.sh@134 -- # waitforlisten 4002642 00:30:06.284 04:30:53 chaining -- common/autotest_common.sh@827 -- # '[' -z 4002642 ']' 00:30:06.284 04:30:53 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:06.284 04:30:53 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:30:06.284 04:30:53 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:06.284 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:06.284 04:30:53 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:30:06.284 04:30:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:06.284 [2024-05-15 04:30:53.736933] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 
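From this point the test drives bdevperf instead of spdk_dd. The -z and --wait-for-rpc flags keep the freshly started bdevperf (pid 4002642 here) idle so that the bdev stack, malloc0 plus the two chained crypto bdevs whose keys show up in the 'Found key' notices below, can be created over the default /var/tmp/spdk.sock before the 5-second verify workload is released with bdevperf.py perform_tests. A hedged outline of that sequence; the actual bdev-creation RPCs are hidden behind the rpc_cmd wrapper in the log, so they are only indicated by a comment:

    # Sketch; $SPDK_ROOT is a placeholder, RPC payloads are not shown in the log.
    "$SPDK_ROOT"/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
    bperfpid=$!
    # ... wait for /var/tmp/spdk.sock, then send the RPCs that build malloc0, crypto0 and crypto1 ...
    "$SPDK_ROOT"/examples/bdev/bdevperf/bdevperf.py perform_tests    # kicks off the timed run
    kill "$bperfpid"    # the script itself goes through killprocess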
00:30:06.284 [2024-05-15 04:30:53.737003] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4002642 ] 00:30:06.284 [2024-05-15 04:30:53.810935] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:06.284 [2024-05-15 04:30:53.921571] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:06.284 04:30:53 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:06.284 04:30:53 chaining -- common/autotest_common.sh@860 -- # return 0 00:30:06.284 04:30:53 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:30:06.284 04:30:53 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:06.284 04:30:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:06.284 malloc0 00:30:06.284 true 00:30:06.284 true 00:30:06.284 [2024-05-15 04:30:54.136063] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:30:06.284 crypto0 00:30:06.284 [2024-05-15 04:30:54.144085] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:30:06.284 crypto1 00:30:06.284 04:30:54 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:06.284 04:30:54 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:06.284 Running I/O for 5 seconds... 00:30:11.547 00:30:11.547 Latency(us) 00:30:11.547 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:11.547 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:30:11.547 Verification LBA range: start 0x0 length 0x2000 00:30:11.547 crypto1 : 5.02 12199.64 47.65 0.00 0.00 20930.18 5606.97 13786.83 00:30:11.547 =================================================================================================================== 00:30:11.547 Total : 12199.64 47.65 0.00 0.00 20930.18 5606.97 13786.83 00:30:11.547 0 00:30:11.547 04:30:59 chaining -- bdev/chaining.sh@146 -- # killprocess 4002642 00:30:11.547 04:30:59 chaining -- common/autotest_common.sh@946 -- # '[' -z 4002642 ']' 00:30:11.547 04:30:59 chaining -- common/autotest_common.sh@950 -- # kill -0 4002642 00:30:11.547 04:30:59 chaining -- common/autotest_common.sh@951 -- # uname 00:30:11.547 04:30:59 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:11.547 04:30:59 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 4002642 00:30:11.547 04:30:59 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:30:11.547 04:30:59 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:30:11.547 04:30:59 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 4002642' 00:30:11.547 killing process with pid 4002642 00:30:11.547 04:30:59 chaining -- common/autotest_common.sh@965 -- # kill 4002642 00:30:11.547 Received shutdown signal, test time was about 5.000000 seconds 00:30:11.547 00:30:11.547 Latency(us) 00:30:11.547 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:11.547 =================================================================================================================== 00:30:11.547 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:11.547 04:30:59 chaining -- common/autotest_common.sh@970 -- # wait 4002642 00:30:11.805 04:30:59 chaining -- bdev/chaining.sh@152 -- # bperfpid=4003319 00:30:11.805 
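The latency table for that first bdevperf run is easy to sanity-check: throughput should be IOPS times the 4096-byte IO size, and with a queue depth of 256 the average latency should sit near depth divided by IOPS (Little's law). Both relations hold for the crypto1 row above:

    # Spot-check of the crypto1 row (12199.64 IOPS, queue depth 256, 4 KiB IOs).
    awk 'BEGIN {
        iops = 12199.64; qd = 256; io = 4096
        printf "throughput  %.2f MiB/s\n", iops * io / 1048576    # table: 47.65 MiB/s
        printf "avg latency %.0f us\n",    qd / iops * 1e6        # table: 20930.18 us
    }'

The small gap on the latency side is expected, since Little's law is only exact at steady state and the 5.02 s runtime includes start-up.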
04:30:59 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:30:11.805 04:30:59 chaining -- bdev/chaining.sh@154 -- # waitforlisten 4003319 00:30:11.805 04:30:59 chaining -- common/autotest_common.sh@827 -- # '[' -z 4003319 ']' 00:30:11.805 04:30:59 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:11.805 04:30:59 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:30:11.805 04:30:59 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:11.805 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:11.805 04:30:59 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:30:11.805 04:30:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:11.805 [2024-05-15 04:30:59.626254] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:30:11.805 [2024-05-15 04:30:59.626342] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4003319 ] 00:30:11.805 [2024-05-15 04:30:59.701883] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:11.805 [2024-05-15 04:30:59.807957] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:12.738 04:31:00 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:12.738 04:31:00 chaining -- common/autotest_common.sh@860 -- # return 0 00:30:12.738 04:31:00 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:30:12.738 04:31:00 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:12.738 04:31:00 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:12.738 malloc0 00:30:12.738 true 00:30:12.738 true 00:30:12.738 [2024-05-15 04:31:00.726002] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:30:12.738 [2024-05-15 04:31:00.726060] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:12.738 [2024-05-15 04:31:00.726087] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x978d30 00:30:12.738 [2024-05-15 04:31:00.726103] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:12.738 [2024-05-15 04:31:00.727155] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:12.738 [2024-05-15 04:31:00.727183] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:30:12.738 pt0 00:30:12.738 [2024-05-15 04:31:00.734030] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:30:12.738 crypto0 00:30:12.738 [2024-05-15 04:31:00.742048] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:30:12.738 crypto1 00:30:12.738 04:31:00 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:12.738 04:31:00 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:12.995 Running I/O for 5 seconds... 
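The second bdevperf run differs from the first only in how the stack is built: the vbdev_passthru notices above show a passthru bdev pt0 claiming malloc0 before the crypto bdevs are created, so the chain under test presumably runs malloc0 -> pt0 -> crypto0 -> crypto1. When reproducing this locally, the stack can be confirmed while bdevperf is still idle; this inspection step is not part of the log:

    # Hypothetical check (not in the log): list the bdevs this bdevperf instance has registered.
    "$SPDK_ROOT"/scripts/rpc.py bdev_get_bdevs | jq -r '.[].name'
    # Given the notices above, the expected names are malloc0, pt0, crypto0 and crypto1.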
00:30:18.257 00:30:18.257 Latency(us) 00:30:18.257 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:18.257 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:30:18.257 Verification LBA range: start 0x0 length 0x2000 00:30:18.258 crypto1 : 5.01 10021.58 39.15 0.00 0.00 25479.18 964.84 16408.27 00:30:18.258 =================================================================================================================== 00:30:18.258 Total : 10021.58 39.15 0.00 0.00 25479.18 964.84 16408.27 00:30:18.258 0 00:30:18.258 04:31:05 chaining -- bdev/chaining.sh@167 -- # killprocess 4003319 00:30:18.258 04:31:05 chaining -- common/autotest_common.sh@946 -- # '[' -z 4003319 ']' 00:30:18.258 04:31:05 chaining -- common/autotest_common.sh@950 -- # kill -0 4003319 00:30:18.258 04:31:05 chaining -- common/autotest_common.sh@951 -- # uname 00:30:18.258 04:31:05 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:18.258 04:31:05 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 4003319 00:30:18.258 04:31:05 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:30:18.258 04:31:05 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:30:18.258 04:31:05 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 4003319' 00:30:18.258 killing process with pid 4003319 00:30:18.258 04:31:05 chaining -- common/autotest_common.sh@965 -- # kill 4003319 00:30:18.258 Received shutdown signal, test time was about 5.000000 seconds 00:30:18.258 00:30:18.258 Latency(us) 00:30:18.258 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:18.258 =================================================================================================================== 00:30:18.258 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:18.258 04:31:05 chaining -- common/autotest_common.sh@970 -- # wait 4003319 00:30:18.258 04:31:06 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:30:18.258 04:31:06 chaining -- bdev/chaining.sh@170 -- # killprocess 4003319 00:30:18.258 04:31:06 chaining -- common/autotest_common.sh@946 -- # '[' -z 4003319 ']' 00:30:18.258 04:31:06 chaining -- common/autotest_common.sh@950 -- # kill -0 4003319 00:30:18.258 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 950: kill: (4003319) - No such process 00:30:18.258 04:31:06 chaining -- common/autotest_common.sh@973 -- # echo 'Process with pid 4003319 is not found' 00:30:18.258 Process with pid 4003319 is not found 00:30:18.258 04:31:06 chaining -- bdev/chaining.sh@171 -- # wait 4003319 00:30:18.258 04:31:06 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:30:18.258 04:31:06 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:30:18.258 04:31:06 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:30:18.258 04:31:06 chaining 
-- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:30:18.258 04:31:06 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@296 -- # e810=() 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@297 -- # x722=() 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@298 -- # mlx=() 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:82:00.0 (0x8086 - 0x159b)' 00:30:18.258 Found 0000:82:00.0 (0x8086 - 0x159b) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:30:18.258 
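The nvmftestinit block running here is plain hardware discovery: nvmf/common.sh scans the cached PCI IDs for supported Intel (E810/X722) and Mellanox parts, skips the RDMA-only branches because the transport is tcp, and in the lines that follow resolves each matching PCI function to its kernel interface through sysfs, which is how the two 0000:82:00.x ports end up as cvl_0_0 and cvl_0_1. The per-device lookup is nothing more than:

    # Same sysfs lookup the helper performs for each matched device (path taken from the trace).
    pci=0000:82:00.0                        # first matched port in this log
    ls /sys/bus/pci/devices/"$pci"/net/     # prints cvl_0_0 on this machine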
04:31:06 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:82:00.1 (0x8086 - 0x159b)' 00:30:18.258 Found 0000:82:00.1 (0x8086 - 0x159b) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:82:00.0: cvl_0_0' 00:30:18.258 Found net devices under 0000:82:00.0: cvl_0_0 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:82:00.1: cvl_0_1' 00:30:18.258 Found net devices under 0000:82:00.1: cvl_0_1 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@242 -- # 
NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:30:18.258 04:31:06 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:30:18.516 04:31:06 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:30:18.516 04:31:06 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:30:18.516 04:31:06 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:30:18.516 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:30:18.516 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.138 ms 00:30:18.516 00:30:18.516 --- 10.0.0.2 ping statistics --- 00:30:18.516 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:30:18.516 rtt min/avg/max/mdev = 0.138/0.138/0.138/0.000 ms 00:30:18.516 04:31:06 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:30:18.516 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:30:18.516 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.089 ms 00:30:18.516 00:30:18.516 --- 10.0.0.1 ping statistics --- 00:30:18.516 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:30:18.516 rtt min/avg/max/mdev = 0.089/0.089/0.089/0.000 ms 00:30:18.516 04:31:06 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:30:18.516 04:31:06 chaining -- nvmf/common.sh@422 -- # return 0 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:30:18.517 04:31:06 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:30:18.517 04:31:06 chaining -- common/autotest_common.sh@720 -- # xtrace_disable 00:30:18.517 04:31:06 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@481 -- # nvmfpid=4004139 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:30:18.517 04:31:06 chaining -- nvmf/common.sh@482 -- # waitforlisten 4004139 00:30:18.517 04:31:06 chaining -- common/autotest_common.sh@827 -- # '[' -z 4004139 ']' 00:30:18.517 04:31:06 chaining -- 
common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:18.517 04:31:06 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:30:18.517 04:31:06 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:18.517 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:18.517 04:31:06 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:30:18.517 04:31:06 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:18.517 [2024-05-15 04:31:06.380457] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:30:18.517 [2024-05-15 04:31:06.380526] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:18.517 [2024-05-15 04:31:06.462502] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:18.775 [2024-05-15 04:31:06.573018] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:30:18.775 [2024-05-15 04:31:06.573072] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:30:18.775 [2024-05-15 04:31:06.573103] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:30:18.775 [2024-05-15 04:31:06.573115] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:30:18.775 [2024-05-15 04:31:06.573125] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:30:18.775 [2024-05-15 04:31:06.573176] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:30:19.339 04:31:07 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:19.339 04:31:07 chaining -- common/autotest_common.sh@860 -- # return 0 00:30:19.340 04:31:07 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:30:19.340 04:31:07 chaining -- common/autotest_common.sh@726 -- # xtrace_disable 00:30:19.340 04:31:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:19.340 04:31:07 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:30:19.340 04:31:07 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:30:19.340 04:31:07 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:19.340 04:31:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:19.598 malloc0 00:30:19.598 [2024-05-15 04:31:07.371798] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:30:19.598 [2024-05-15 04:31:07.387754] nvmf_rpc.c: 615:decode_rpc_listen_address: *WARNING*: decode_rpc_listen_address: deprecated feature [listen_]address.transport is deprecated in favor of trtype to be removed in v24.09 00:30:19.598 [2024-05-15 04:31:07.388063] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:30:19.598 04:31:07 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:19.598 04:31:07 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:30:19.598 04:31:07 chaining -- bdev/chaining.sh@189 -- # bperfpid=4004289 00:30:19.598 04:31:07 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w 
verify -o 4096 -q 256 --wait-for-rpc -z 00:30:19.598 04:31:07 chaining -- bdev/chaining.sh@191 -- # waitforlisten 4004289 /var/tmp/bperf.sock 00:30:19.598 04:31:07 chaining -- common/autotest_common.sh@827 -- # '[' -z 4004289 ']' 00:30:19.598 04:31:07 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/bperf.sock 00:30:19.598 04:31:07 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:30:19.598 04:31:07 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:30:19.598 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:30:19.598 04:31:07 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:30:19.598 04:31:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:19.598 [2024-05-15 04:31:07.450040] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:30:19.598 [2024-05-15 04:31:07.450126] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4004289 ] 00:30:19.598 [2024-05-15 04:31:07.530762] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:19.856 [2024-05-15 04:31:07.648872] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:19.856 04:31:07 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:19.856 04:31:07 chaining -- common/autotest_common.sh@860 -- # return 0 00:30:19.856 04:31:07 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:30:19.856 04:31:07 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:30:20.421 [2024-05-15 04:31:08.149703] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:30:20.421 nvme0n1 00:30:20.421 true 00:30:20.421 crypto0 00:30:20.421 04:31:08 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:30:20.421 Running I/O for 5 seconds... 
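Because the nvmf target already holds the default application socket, this bdevperf instance was started with -r /var/tmp/bperf.sock, and every control-plane call in this section is pointed at that socket: rpc_bperf (chaining.sh@22) is simply rpc.py -s /var/tmp/bperf.sock, and perform_tests is likewise issued through bdevperf.py -s /var/tmp/bperf.sock. Pulled out of the script, the pattern looks like this:

    # Sketch of the remote-socket pattern used here; $SPDK_ROOT is a placeholder.
    "$SPDK_ROOT"/build/examples/bdevperf -r /var/tmp/bperf.sock \
        -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
    # ... create the crypto0-on-nvme0n1 stack over the same socket, then:
    "$SPDK_ROOT"/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests
    "$SPDK_ROOT"/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats    # what rpc_bperf wraps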
00:30:25.686 00:30:25.686 Latency(us) 00:30:25.686 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:25.686 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:30:25.686 Verification LBA range: start 0x0 length 0x2000 00:30:25.686 crypto0 : 5.02 9247.76 36.12 0.00 0.00 27599.56 4563.25 21456.97 00:30:25.686 =================================================================================================================== 00:30:25.686 Total : 9247.76 36.12 0.00 0.00 27599.56 4563.25 21456.97 00:30:25.686 0 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@205 -- # sequence=92834 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:25.686 04:31:13 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@206 -- # encrypt=46417 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:25.944 04:31:13 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@207 -- # decrypt=46417 
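The counters just read back line up with the chaining invariant asserted a few lines below (chaining.sh@210-@212): in this verify workload every executed accel sequence appears to pair one encrypt or decrypt with one crc32c, so encrypt plus decrypt must equal both sequence_executed and the crc32c count (46417 + 46417 = 92834 here). The same checks can be reproduced against the bperf socket in one go:

    # One-shot version of the checks at chaining.sh@210-@212; $SPDK_ROOT is a placeholder.
    "$SPDK_ROOT"/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats | jq -e '
        .sequence_executed as $seq
        | ([.operations[] | select(.opcode == "encrypt" or .opcode == "decrypt").executed] | add) as $crypto
        | ([.operations[] | select(.opcode == "crc32c").executed] | add) as $crc
        | $seq > 0 and $crypto == $seq and $crypto == $crc'
    # jq -e exits 0 only when the final expression is true.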
00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:26.201 04:31:14 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:30:26.458 04:31:14 chaining -- bdev/chaining.sh@208 -- # crc32c=92834 00:30:26.458 04:31:14 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:30:26.458 04:31:14 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:30:26.458 04:31:14 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:30:26.458 04:31:14 chaining -- bdev/chaining.sh@214 -- # killprocess 4004289 00:30:26.458 04:31:14 chaining -- common/autotest_common.sh@946 -- # '[' -z 4004289 ']' 00:30:26.458 04:31:14 chaining -- common/autotest_common.sh@950 -- # kill -0 4004289 00:30:26.458 04:31:14 chaining -- common/autotest_common.sh@951 -- # uname 00:30:26.458 04:31:14 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:26.458 04:31:14 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 4004289 00:30:26.458 04:31:14 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:30:26.458 04:31:14 chaining -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:30:26.458 04:31:14 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 4004289' 00:30:26.458 killing process with pid 4004289 00:30:26.458 04:31:14 chaining -- common/autotest_common.sh@965 -- # kill 4004289 00:30:26.458 Received shutdown signal, test time was about 5.000000 seconds 00:30:26.458 00:30:26.458 Latency(us) 00:30:26.459 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:26.459 =================================================================================================================== 00:30:26.459 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:26.459 04:31:14 chaining -- common/autotest_common.sh@970 -- # wait 4004289 00:30:26.716 04:31:14 chaining -- bdev/chaining.sh@219 -- # bperfpid=4005101 00:30:26.716 04:31:14 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:30:26.716 04:31:14 chaining -- bdev/chaining.sh@221 -- # waitforlisten 4005101 /var/tmp/bperf.sock 00:30:26.716 04:31:14 chaining -- common/autotest_common.sh@827 -- # '[' -z 4005101 ']' 00:30:26.716 04:31:14 chaining -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/bperf.sock 00:30:26.716 04:31:14 chaining -- common/autotest_common.sh@832 -- # local max_retries=100 00:30:26.716 04:31:14 chaining -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 
00:30:26.716 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:30:26.716 04:31:14 chaining -- common/autotest_common.sh@836 -- # xtrace_disable 00:30:26.716 04:31:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:26.716 [2024-05-15 04:31:14.650556] Starting SPDK v24.05-pre git sha1 2dc74a001 / DPDK 23.11.0 initialization... 00:30:26.716 [2024-05-15 04:31:14.650624] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid4005101 ] 00:30:26.716 [2024-05-15 04:31:14.725435] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:26.976 [2024-05-15 04:31:14.832699] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:30:26.976 04:31:14 chaining -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:30:26.976 04:31:14 chaining -- common/autotest_common.sh@860 -- # return 0 00:30:26.976 04:31:14 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:30:26.976 04:31:14 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:30:27.542 [2024-05-15 04:31:15.296305] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:30:27.543 nvme0n1 00:30:27.543 true 00:30:27.543 crypto0 00:30:27.543 04:31:15 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:30:27.543 Running I/O for 5 seconds... 00:30:32.809 00:30:32.809 Latency(us) 00:30:32.809 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:32.809 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:30:32.809 Verification LBA range: start 0x0 length 0x200 00:30:32.809 crypto0 : 5.01 1810.44 113.15 0.00 0.00 17377.26 1341.06 19223.89 00:30:32.809 =================================================================================================================== 00:30:32.809 Total : 1810.44 113.15 0.00 0.00 17377.26 1341.06 19223.89 00:30:32.809 0 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@39 -- # opcode= 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@233 -- # sequence=18126 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:32.809 04:31:20 chaining -- 
bdev/chaining.sh@39 -- # opcode=encrypt 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:30:32.809 04:31:20 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@234 -- # encrypt=9063 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:30:33.067 04:31:20 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@235 -- # decrypt=9063 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:30:33.325 04:31:21 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:30:33.582 04:31:21 chaining -- bdev/chaining.sh@236 -- # crc32c=18126 00:30:33.582 04:31:21 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:30:33.583 04:31:21 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:30:33.583 04:31:21 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:30:33.583 04:31:21 chaining -- bdev/chaining.sh@242 -- # killprocess 4005101 00:30:33.583 04:31:21 chaining -- common/autotest_common.sh@946 -- # '[' -z 4005101 ']' 00:30:33.583 04:31:21 chaining -- common/autotest_common.sh@950 -- # kill -0 4005101 00:30:33.583 04:31:21 chaining -- common/autotest_common.sh@951 -- # uname 00:30:33.583 04:31:21 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:33.583 04:31:21 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 4005101 00:30:33.583 04:31:21 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:30:33.583 04:31:21 chaining -- 
common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:30:33.583 04:31:21 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 4005101' 00:30:33.583 killing process with pid 4005101 00:30:33.583 04:31:21 chaining -- common/autotest_common.sh@965 -- # kill 4005101 00:30:33.583 Received shutdown signal, test time was about 5.000000 seconds 00:30:33.583 00:30:33.583 Latency(us) 00:30:33.583 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:33.583 =================================================================================================================== 00:30:33.583 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:33.583 04:31:21 chaining -- common/autotest_common.sh@970 -- # wait 4005101 00:30:33.841 04:31:21 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@117 -- # sync 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@120 -- # set +e 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:30:33.841 rmmod nvme_tcp 00:30:33.841 rmmod nvme_fabrics 00:30:33.841 rmmod nvme_keyring 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@124 -- # set -e 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@125 -- # return 0 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@489 -- # '[' -n 4004139 ']' 00:30:33.841 04:31:21 chaining -- nvmf/common.sh@490 -- # killprocess 4004139 00:30:33.841 04:31:21 chaining -- common/autotest_common.sh@946 -- # '[' -z 4004139 ']' 00:30:33.841 04:31:21 chaining -- common/autotest_common.sh@950 -- # kill -0 4004139 00:30:33.841 04:31:21 chaining -- common/autotest_common.sh@951 -- # uname 00:30:33.841 04:31:21 chaining -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:30:33.841 04:31:21 chaining -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 4004139 00:30:33.841 04:31:21 chaining -- common/autotest_common.sh@952 -- # process_name=reactor_1 00:30:33.841 04:31:21 chaining -- common/autotest_common.sh@956 -- # '[' reactor_1 = sudo ']' 00:30:33.841 04:31:21 chaining -- common/autotest_common.sh@964 -- # echo 'killing process with pid 4004139' 00:30:33.841 killing process with pid 4004139 00:30:33.841 04:31:21 chaining -- common/autotest_common.sh@965 -- # kill 4004139 00:30:33.841 [2024-05-15 04:31:21.833679] app.c:1024:log_deprecation_hits: *WARNING*: decode_rpc_listen_address: deprecation '[listen_]address.transport is deprecated in favor of trtype' scheduled for removal in v24.09 hit 1 times 00:30:33.841 04:31:21 chaining -- common/autotest_common.sh@970 -- # wait 4004139 00:30:34.405 04:31:22 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:30:34.405 04:31:22 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:30:34.405 04:31:22 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:30:34.405 04:31:22 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:30:34.405 04:31:22 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:30:34.405 04:31:22 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:30:34.405 04:31:22 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:30:34.405 04:31:22 chaining -- 
common/autotest_common.sh@22 -- # _remove_spdk_ns 00:30:36.346 04:31:24 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:30:36.346 04:31:24 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:30:36.346 00:30:36.346 real 0m42.146s 00:30:36.346 user 0m55.509s 00:30:36.346 sys 0m7.806s 00:30:36.346 04:31:24 chaining -- common/autotest_common.sh@1122 -- # xtrace_disable 00:30:36.346 04:31:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:30:36.346 ************************************ 00:30:36.346 END TEST chaining 00:30:36.346 ************************************ 00:30:36.346 04:31:24 -- spdk/autotest.sh@359 -- # [[ 0 -eq 1 ]] 00:30:36.346 04:31:24 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:30:36.346 04:31:24 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:30:36.346 04:31:24 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:30:36.346 04:31:24 -- spdk/autotest.sh@376 -- # trap - SIGINT SIGTERM EXIT 00:30:36.346 04:31:24 -- spdk/autotest.sh@378 -- # timing_enter post_cleanup 00:30:36.346 04:31:24 -- common/autotest_common.sh@720 -- # xtrace_disable 00:30:36.346 04:31:24 -- common/autotest_common.sh@10 -- # set +x 00:30:36.346 04:31:24 -- spdk/autotest.sh@379 -- # autotest_cleanup 00:30:36.346 04:31:24 -- common/autotest_common.sh@1388 -- # local autotest_es=0 00:30:36.346 04:31:24 -- common/autotest_common.sh@1389 -- # xtrace_disable 00:30:36.346 04:31:24 -- common/autotest_common.sh@10 -- # set +x 00:30:38.317 INFO: APP EXITING 00:30:38.317 INFO: killing all VMs 00:30:38.317 INFO: killing vhost app 00:30:38.317 INFO: EXIT DONE 00:30:39.250 Waiting for block devices as requested 00:30:39.250 0000:81:00.0 (8086 0a54): vfio-pci -> nvme 00:30:39.508 0000:00:04.7 (8086 0e27): vfio-pci -> ioatdma 00:30:39.508 0000:00:04.6 (8086 0e26): vfio-pci -> ioatdma 00:30:39.508 0000:00:04.5 (8086 0e25): vfio-pci -> ioatdma 00:30:39.766 0000:00:04.4 (8086 0e24): vfio-pci -> ioatdma 00:30:39.766 0000:00:04.3 (8086 0e23): vfio-pci -> ioatdma 00:30:39.766 0000:00:04.2 (8086 0e22): vfio-pci -> ioatdma 00:30:39.766 0000:00:04.1 (8086 0e21): vfio-pci -> ioatdma 00:30:39.766 0000:00:04.0 (8086 0e20): vfio-pci -> ioatdma 00:30:40.024 0000:80:04.7 (8086 0e27): vfio-pci -> ioatdma 00:30:40.024 0000:80:04.6 (8086 0e26): vfio-pci -> ioatdma 00:30:40.024 0000:80:04.5 (8086 0e25): vfio-pci -> ioatdma 00:30:40.282 0000:80:04.4 (8086 0e24): vfio-pci -> ioatdma 00:30:40.282 0000:80:04.3 (8086 0e23): vfio-pci -> ioatdma 00:30:40.282 0000:80:04.2 (8086 0e22): vfio-pci -> ioatdma 00:30:40.282 0000:80:04.1 (8086 0e21): vfio-pci -> ioatdma 00:30:40.540 0000:80:04.0 (8086 0e20): vfio-pci -> ioatdma 00:30:41.911 Cleaning 00:30:41.911 Removing: /var/run/dpdk/spdk0/config 00:30:41.911 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:30:41.911 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:30:41.911 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:30:41.911 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:30:41.911 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:30:41.911 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:30:41.911 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:30:41.911 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:30:41.911 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:30:41.911 Removing: /var/run/dpdk/spdk0/hugepage_info 00:30:41.911 Removing: /dev/shm/nvmf_trace.0 00:30:41.911 Removing: /dev/shm/spdk_tgt_trace.pid3788641 00:30:41.911 Removing: /var/run/dpdk/spdk0 00:30:41.911 Removing: 
/var/run/dpdk/spdk_pid3786030 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3787694 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3788641 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3789084 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3789777 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3790040 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3790759 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3790825 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3791066 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3793574 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3795026 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3795333 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3795600 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3795858 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3796179 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3796342 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3796494 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3796801 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3797101 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3799976 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3800208 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3800447 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3800631 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3800771 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3800845 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3801113 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3801280 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3801492 00:30:41.911 Removing: /var/run/dpdk/spdk_pid3801709 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3801877 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3802143 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3802306 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3802463 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3802737 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3802894 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3803097 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3803325 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3803492 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3803760 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3803917 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3804080 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3804355 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3804519 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3804712 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3804949 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3805168 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3805407 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3805686 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3805970 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3806252 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3806537 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3806823 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3807111 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3807294 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3807515 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3807934 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3808314 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3808468 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3812030 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3813589 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3815009 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3815840 00:30:41.912 Removing: /var/run/dpdk/spdk_pid3816789 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3817070 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3817102 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3817245 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3820670 00:30:42.169 Removing: 
/var/run/dpdk/spdk_pid3821201 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3822643 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3822818 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3827568 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3831840 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3836123 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3846035 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3856394 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3866273 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3878924 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3890150 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3901856 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3905300 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3908059 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3912635 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3914848 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3918877 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3921898 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3927483 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3929980 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3935864 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3938033 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3943623 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3945782 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3951985 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3954036 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3957946 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3958333 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3958607 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3959005 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3959362 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3960015 00:30:42.169 Removing: /var/run/dpdk/spdk_pid3960631 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3960979 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3962588 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3964196 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3965809 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3967074 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3968686 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3970289 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3972510 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3973780 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3974326 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3974742 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3976699 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3978475 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3980141 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3981089 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3982156 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3982699 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3982844 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3982909 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3983068 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3983217 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3984321 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3985721 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3987114 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3987795 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3988474 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3988752 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3988775 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3988805 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3989655 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3990201 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3990626 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3992570 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3994353 00:30:42.170 Removing: 
/var/run/dpdk/spdk_pid3996601 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3997562 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3998632 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3999177 00:30:42.170 Removing: /var/run/dpdk/spdk_pid3999198 00:30:42.170 Removing: /var/run/dpdk/spdk_pid4001544 00:30:42.170 Removing: /var/run/dpdk/spdk_pid4001714 00:30:42.170 Removing: /var/run/dpdk/spdk_pid4001868 00:30:42.170 Removing: /var/run/dpdk/spdk_pid4002026 00:30:42.170 Removing: /var/run/dpdk/spdk_pid4002193 00:30:42.170 Removing: /var/run/dpdk/spdk_pid4002642 00:30:42.170 Removing: /var/run/dpdk/spdk_pid4003319 00:30:42.170 Removing: /var/run/dpdk/spdk_pid4004289 00:30:42.170 Removing: /var/run/dpdk/spdk_pid4005101 00:30:42.170 Clean 00:30:42.170 04:31:30 -- common/autotest_common.sh@1447 -- # return 0 00:30:42.170 04:31:30 -- spdk/autotest.sh@380 -- # timing_exit post_cleanup 00:30:42.170 04:31:30 -- common/autotest_common.sh@726 -- # xtrace_disable 00:30:42.170 04:31:30 -- common/autotest_common.sh@10 -- # set +x 00:30:42.427 04:31:30 -- spdk/autotest.sh@382 -- # timing_exit autotest 00:30:42.427 04:31:30 -- common/autotest_common.sh@726 -- # xtrace_disable 00:30:42.427 04:31:30 -- common/autotest_common.sh@10 -- # set +x 00:30:42.427 04:31:30 -- spdk/autotest.sh@383 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:30:42.427 04:31:30 -- spdk/autotest.sh@385 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:30:42.427 04:31:30 -- spdk/autotest.sh@385 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:30:42.427 04:31:30 -- spdk/autotest.sh@387 -- # hash lcov 00:30:42.427 04:31:30 -- spdk/autotest.sh@387 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:30:42.427 04:31:30 -- spdk/autotest.sh@389 -- # hostname 00:30:42.427 04:31:30 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-gp-12 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:30:42.427 geninfo: WARNING: invalid characters removed from testname! 
00:31:14.483 04:32:01 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:31:17.754 04:32:05 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:31:20.277 04:32:08 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:31:23.551 04:32:10 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:31:26.075 04:32:13 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:31:28.599 04:32:16 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:31:31.125 04:32:19 -- spdk/autotest.sh@396 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:31:31.125 04:32:19 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:31.125 04:32:19 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:31:31.125 04:32:19 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:31.125 04:32:19 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:31.125 04:32:19 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:31.125 04:32:19 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:31.125 04:32:19 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:31.125 04:32:19 -- paths/export.sh@5 -- $ export PATH 00:31:31.125 04:32:19 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:31.125 04:32:19 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:31.125 04:32:19 -- common/autobuild_common.sh@437 -- $ date +%s 00:31:31.125 04:32:19 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1715740339.XXXXXX 00:31:31.125 04:32:19 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1715740339.obUXoC 00:31:31.125 04:32:19 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:31:31.125 04:32:19 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']' 00:31:31.125 04:32:19 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:31:31.125 04:32:19 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:31:31.125 04:32:19 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:31:31.125 04:32:19 -- common/autobuild_common.sh@453 -- $ get_config_params 00:31:31.125 04:32:19 -- common/autotest_common.sh@395 -- $ xtrace_disable 00:31:31.125 04:32:19 -- common/autotest_common.sh@10 -- $ set +x 00:31:31.383 04:32:19 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:31:31.383 04:32:19 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:31:31.383 04:32:19 -- pm/common@17 -- $ local monitor 00:31:31.383 04:32:19 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:31:31.383 04:32:19 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:31:31.383 04:32:19 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:31:31.383 04:32:19 -- pm/common@21 -- $ date +%s 00:31:31.383 04:32:19 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:31:31.383 04:32:19 -- 
pm/common@21 -- $ date +%s 00:31:31.383 04:32:19 -- pm/common@25 -- $ sleep 1 00:31:31.383 04:32:19 -- pm/common@21 -- $ date +%s 00:31:31.383 04:32:19 -- pm/common@21 -- $ date +%s 00:31:31.383 04:32:19 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715740339 00:31:31.383 04:32:19 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715740339 00:31:31.383 04:32:19 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715740339 00:31:31.383 04:32:19 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1715740339 00:31:31.383 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715740339_collect-vmstat.pm.log 00:31:31.383 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715740339_collect-cpu-load.pm.log 00:31:31.383 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715740339_collect-cpu-temp.pm.log 00:31:31.383 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1715740339_collect-bmc-pm.bmc.pm.log 00:31:32.367 04:32:20 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:31:32.367 04:32:20 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j48 00:31:32.367 04:32:20 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:32.367 04:32:20 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:31:32.367 04:32:20 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:31:32.367 04:32:20 -- spdk/autopackage.sh@19 -- $ timing_finish 00:31:32.367 04:32:20 -- common/autotest_common.sh@732 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:31:32.367 04:32:20 -- common/autotest_common.sh@733 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:31:32.367 04:32:20 -- common/autotest_common.sh@735 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:31:32.367 04:32:20 -- spdk/autopackage.sh@20 -- $ exit 0 00:31:32.367 04:32:20 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:31:32.367 04:32:20 -- pm/common@29 -- $ signal_monitor_resources TERM 00:31:32.367 04:32:20 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:31:32.367 04:32:20 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:31:32.367 04:32:20 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:31:32.367 04:32:20 -- pm/common@44 -- $ pid=4016252 00:31:32.367 04:32:20 -- pm/common@50 -- $ kill -TERM 4016252 00:31:32.367 04:32:20 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:31:32.367 04:32:20 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:31:32.367 04:32:20 -- pm/common@44 -- $ pid=4016254 00:31:32.367 04:32:20 -- pm/common@50 -- 
$ kill -TERM 4016254 00:31:32.367 04:32:20 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:31:32.367 04:32:20 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:31:32.367 04:32:20 -- pm/common@44 -- $ pid=4016256 00:31:32.367 04:32:20 -- pm/common@50 -- $ kill -TERM 4016256 00:31:32.367 04:32:20 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:31:32.367 04:32:20 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:31:32.367 04:32:20 -- pm/common@44 -- $ pid=4016292 00:31:32.367 04:32:20 -- pm/common@50 -- $ sudo -E kill -TERM 4016292 00:31:32.367 + [[ -n 3692866 ]] 00:31:32.367 + sudo kill 3692866 00:31:32.379 [Pipeline] } 00:31:32.402 [Pipeline] // stage 00:31:32.407 [Pipeline] } 00:31:32.425 [Pipeline] // timeout 00:31:32.431 [Pipeline] } 00:31:32.447 [Pipeline] // catchError 00:31:32.452 [Pipeline] } 00:31:32.465 [Pipeline] // wrap 00:31:32.471 [Pipeline] } 00:31:32.488 [Pipeline] // catchError 00:31:32.495 [Pipeline] stage 00:31:32.498 [Pipeline] { (Epilogue) 00:31:32.511 [Pipeline] catchError 00:31:32.513 [Pipeline] { 00:31:32.524 [Pipeline] echo 00:31:32.525 Cleanup processes 00:31:32.530 [Pipeline] sh 00:31:32.822 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:32.823 4016418 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:31:32.823 4016522 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:32.839 [Pipeline] sh 00:31:33.121 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:33.121 ++ grep -v 'sudo pgrep' 00:31:33.121 ++ awk '{print $1}' 00:31:33.121 + sudo kill -9 4016418 00:31:33.133 [Pipeline] sh 00:31:33.412 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:31:43.444 [Pipeline] sh 00:31:43.729 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:31:43.729 Artifacts sizes are good 00:31:43.742 [Pipeline] archiveArtifacts 00:31:43.750 Archiving artifacts 00:31:43.909 [Pipeline] sh 00:31:44.189 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:31:44.207 [Pipeline] cleanWs 00:31:44.218 [WS-CLEANUP] Deleting project workspace... 00:31:44.218 [WS-CLEANUP] Deferred wipeout is used... 00:31:44.224 [WS-CLEANUP] done 00:31:44.228 [Pipeline] } 00:31:44.248 [Pipeline] // catchError 00:31:44.263 [Pipeline] sh 00:31:44.572 + logger -p user.info -t JENKINS-CI 00:31:44.582 [Pipeline] } 00:31:44.600 [Pipeline] // stage 00:31:44.606 [Pipeline] } 00:31:44.624 [Pipeline] // node 00:31:44.630 [Pipeline] End of Pipeline 00:31:44.663 Finished: SUCCESS
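The stop_monitor_resources teardown traced in the autopackage epilogue above follows a plain pid-file pattern: one pid file per collector under ../output/power, each sent SIGTERM if present. A minimal sketch of that pattern; reading the pid back with cat is an illustrative assumption, since the trace only shows the pid variable being assigned and then killed:

# Sketch of the pm/common stop_monitor_resources pattern, using the power
# directory from this run. Only the existence check and kill -TERM are
# visible in the trace; cat is assumed here for reading the pid files.
power=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power
for pidfile in collect-cpu-load.pid collect-vmstat.pid \
               collect-cpu-temp.pid collect-bmc-pm.pid; do
        if [[ -e "$power/$pidfile" ]]; then
                kill -TERM "$(cat "$power/$pidfile")" || true
        fi
done

As the trace shows, only the collect-bmc-pm pid is killed with sudo -E, since that collector was started under sudo; the other three are signalled directly.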